Deep Learning for Neanderthal Introgression

Nikolay Oskolkov, SciLifeLab, NBIS Long Term Support, nikolay.oskolkov@scilifelab.se

Abstract

In this notebook, I will demonstrate how we can use Deep Learning for calling regions of Neanderthal introgression in modern humans. We will mostly use methodology developed within the Natural Language Processing (NLP) framework and apply it to ancient and modern DNA, treating the DNA sequence as a biological molecular text. First, however, we start with a simpler "sentiment analysis"-style task: classifying gene vs. non-gene regions.

Table of Contents:

Neanderthal DNA in Modern Human Genomes is Not Silent

Since the revolutionary draft Neanderthal genome paper (https://science.sciencemag.org/content/328/5979/710), a few studies have been carried out to systematically establish maps of regions of Neanderthal introgression in modern humans. Those regions were hypothesized to have multiple functional effects related to skin color, male fertility, etc.

In [1]:
from IPython.display import Image
Path = '/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/'
Image(Path + 'neanderthal_dna.png', width=2000)
Out[1]:

The currently dominant hypothesis of human evolution is that anatomically modern humans originated in Africa and migrated to Europe and Asia approximately 50 000 years ago. In Europe and Asia they met Neanderthals and Denisovans; how the latter came to Europe and Asia is not clear. We know that modern humans interbred with Neanderthals and Denisovans and had common offspring. Modern humans of non-African ancestry carry an estimated 2%-5% fraction of Neanderthal DNA. This became known in 2010, when the group of Svante Pääbo sequenced the draft Neanderthal genome.

In [2]:
from IPython.display import Image
Path = '/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/'
Image(Path + 'BriefHistory.png', width=2000)
Out[2]:

Since the sequencing of the draft Neanderthal genome, there have been a few studies that systematically attempted to detect regions of Neanderthal introgression in modern human genomes, mostly using individuals from the 1000G project:

All the methods of detecting regions of Neanderthal introgression are based on comparing a test modern genome of European or Asian ancestry with the high-coverage Neanderthal and Denisovan genomes on the one hand and with a sub-Saharan African (Yoruba) genome on the other hand. The candidate regions should be as similar as possible to the Neanderthal / Denisovan DNA and as divergent as possible from the African genome. Usually a test statistic such as a Conditional Random Field (CRF), S* or a Hidden Markov Model (HMM) is computed in a sliding window across the whole test genome. The disadvantage of those methods is their memoryless nature, i.e. no memory about the nucleotide sequence is kept in the model. Here we will develop a Deep Learning based method for detecting regions of Neanderthal introgression in modern human genomes. The advantage of this approach is its long memory about the nucleotide sequence, which may bring new interesting candidate regions compared to the previous methods.
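As a toy illustration of this sliding-window logic (purely synthetic sequences and a hypothetical match-count score, not any of the published CRF / S* / HMM statistics):

```python
# Toy sketch: score sliding windows of a test genome by similarity to an
# "archaic" (Neanderthal-like) sequence vs. an "African" reference sequence.
# The sequences and the match-count score are illustrative, not a real method.
def window_scores(test, archaic, african, window=10, step=5):
    """Return (start, matches_to_archaic - matches_to_african) per window."""
    scores = []
    for start in range(0, len(test) - window + 1, step):
        t = test[start:start + window]
        a = archaic[start:start + window]
        f = african[start:start + window]
        match_archaic = sum(x == y for x, y in zip(t, a))
        match_african = sum(x == y for x, y in zip(t, f))
        scores.append((start, match_archaic - match_african))
    return scores

# Strongly positive windows would be introgression candidates.
print(window_scores("AAAACCCC", "AAAATTTT", "GGGGCCCC", window=4, step=4))
# → [(0, 4), (4, -4)]
```

Note that each window is scored independently of its neighbors, which is exactly the memoryless property discussed above.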

Gene vs. Non-Gene Sequence Classification

Before diving into Deep Learning models for calling regions of Neanderthal introgression in modern genomes, let us perform a simpler experiment and try to classify sequences that belong to either gene or intergenic regions. Since both Reich and Akey made their 1000G Neanderthal introgression calls using the hg19 version of the human reference genome, we downloaded hg19 from http://hgdownload.cse.ucsc.edu/goldenPath/hg19/bigZips/ and prepared it for fast sequence extraction with samtools faidx. Next, we are going to build an annotation file for protein-coding genes. We downloaded the RefSeq annotation for hg19 from http://genome.ucsc.edu/cgi-bin/hgTables as a text-file and used the "genePredToGtf" tool to build the refGene_hg19.gtf gtf-annotation file. The gtf-annotation looks messy: it includes both gene and exon annotation, so we will select only the gene annotation, with the keyword "transcript" in the third column.

In [1]:
%%bash
cd /home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes
awk -v OFS='\t' '{if($3=="transcript") print $1,$4,$5,$10}' refGene_hg19.gtf | \
tr -d '\"' | sed 's/\;//g' | sort | uniq > gene_coords.txt
echo
head gene_coords.txt
echo
wc -l gene_coords.txt
chr10	100007443	100028007	LOXL4
chr10	100143322	100174978	PYROXD2
chr10	100154975	100155064	MIR1287
chr10	100175955	100206720	HPS1
chr10	100188903	100206720	HPS1
chr10	100191049	100191117	MIR4685
chr10	100206078	100213562	LOC101927278
chr10	100216834	100995632	HPSE2
chr10	100684256	100684325	MIR6507
chr10	10100685	10105465	LOC101928298

40996 gene_coords.txt

We can see that we have 40996 RefSeq genes, which sounds like a lot, since we know that the human genome has approximately 20 000 protein-coding genes. This discrepancy is explained by the non-uniqueness of the gene symbols in the last column. You can see that there are at least two HPS1 entries with different coordinates. This is a known fact, and it is why the unique Ensembl gene IDs exist: one gene symbol can correspond to multiple Ensembl IDs.
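To see such duplicated symbols directly in Python, here is a minimal sketch on a toy table mimicking gene_coords.txt (the column names and exact values are illustrative):

```python
import pandas as pd

# Toy table mimicking gene_coords.txt: HPS1 appears twice with
# different start coordinates, as in the RefSeq annotation above.
gene_coords = pd.DataFrame({
    'chrom':  ['chr10', 'chr10', 'chr10'],
    'start':  [100175955, 100188903, 100143322],
    'end':    [100206720, 100206720, 100174978],
    'symbol': ['HPS1', 'HPS1', 'PYROXD2'],
})

# All rows whose gene symbol occurs more than once:
dups = gene_coords[gene_coords.duplicated('symbol', keep=False)]
print(dups['symbol'].tolist())  # → ['HPS1', 'HPS1']
print(gene_coords['symbol'].nunique(), 'unique symbols out of', len(gene_coords))
```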

In [2]:
%%bash
cd /home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes
cut -f4 gene_coords.txt | sort | uniq | wc -l
27565

If we check how many unique gene symbols we have, we get 27 565 genes, which is closer to the number of protein-coding genes in the human genome. Further, if we exclude non-coding RNAs and LOC-genes, we get almost the correct number of protein-coding genes.

In [3]:
%%bash
cd /home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes
cut -f4 gene_coords.txt | sort | uniq | grep -v "LOC" | grep -v "MIR" | grep -v "LINC" | wc -l
22432

For now we are going to keep all 40 996 genes, assuming that they all have unique Ensembl IDs even though they might be duplicates in the sense of gene symbol. Now we are going to read the gene coordinates into Python and plot the distribution of gene lengths.

In [4]:
import pandas as pd
Path = '/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes/'
gene_coords = pd.read_csv(Path + 'gene_coords.txt', header=None, sep="\t")
gene_coords.sort_values(by = [0, 1, 2], inplace = True)
gene_coords.to_csv(Path + 'gene_coords.txt', index = False, header = False, sep = '\t')
gene_coords.head()
Out[4]:
0 1 2 3
2596 chr1 11874 14409 DDX11L1
3030 chr1 14362 29370 WASH7P
4909 chr1 17369 17436 MIR6859-1
4910 chr1 17369 17436 MIR6859-2
4911 chr1 17369 17436 MIR6859-3
In [5]:
gene_coords.shape
Out[5]:
(40996, 4)
In [7]:
import seaborn as sns
import matplotlib.pyplot as plt
plt.figure(figsize=(20,15))
gene_lengths = gene_coords.iloc[:, 2]-gene_coords.iloc[:, 1]
sns.distplot(gene_lengths)
plt.title("Distribution of Gene Lengths", fontsize = 20)
plt.xlabel("Lengths of Genes", fontsize = 20)
plt.ylabel("Frequency", fontsize = 20)
plt.show()
In [8]:
from scipy import stats
print(stats.describe(gene_lengths))
DescribeResult(nobs=40996, minmax=(19, 2320933), mean=52554.41474778027, variance=12473868428.7287, skewness=6.621632083720264, kurtosis=72.16643584088652)

We can see that gene lengths vary from a minimum of 19 bp up to 2.3 Mbp, with a mean length of 52 kbp. These numbers are good to keep in mind for the downstream analysis. Now we are going to use the gene coordinates and the hg19 human reference genome fasta-file, which we downloaded from http://hgdownload.cse.ucsc.edu/goldenPath/hg19/bigZips/, in order to extract the gene sequences with samtools. One can of course do it in Python, but samtools is much faster, so we will use samtools here, running it from Python.
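Just for illustration, the extraction can be sketched in pure Python; below is a minimal faidx-style lookup on a toy in-memory FASTA (for the full hg19 genome, samtools remains the much faster option):

```python
from io import StringIO

def read_fasta(handle):
    """Minimal FASTA parser: returns {name: sequence}."""
    seqs, name, parts = {}, None, []
    for line in handle:
        line = line.strip()
        if line.startswith('>'):
            if name is not None:
                seqs[name] = ''.join(parts)
            name, parts = line[1:].split()[0], []
        else:
            parts.append(line)
    if name is not None:
        seqs[name] = ''.join(parts)
    return seqs

def extract(seqs, chrom, start, end):
    """1-based inclusive coordinates, matching samtools faidx chrom:start-end."""
    return seqs[chrom][start - 1:end]

# Toy two-line FASTA record (samtools also wraps sequences over lines):
seqs = read_fasta(StringIO(">chr_toy\nACGTACGT\nACGTACGT\n"))
print(extract(seqs, 'chr_toy', 5, 8))  # → ACGT
```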

In [9]:
import os
import subprocess
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes/')
with open('hg19_gene_regions.fa', 'a') as fp:
    for i in range(gene_coords.shape[0]):
        coord = str(str(gene_coords.iloc[i, 0]) + ':' 
                    + str(gene_coords.iloc[i, 1]) + '-' + str(gene_coords.iloc[i, 2]))
        subprocess.run(['samtools', 'faidx', 'hg19.fa.gz', str(coord)], stdout = fp)

Now we have a fasta-file with 40 996 nucleotide sequences of genes from the hg19 human genome. Since we want to learn DNA motifs that discriminate between genes and non-genes, we need to build another fasta-file with 40 996 sequences of non-gene (intergenic) regions, each of exactly the same length as the corresponding gene region. To do this, we need to know the length of each chromosome in order to randomly draw intergenic regions. When indexing the hg19 reference genome, samtools created a fai-file which contains the chromosome lengths in the second column; we will read and sort it:

In [10]:
chr_sizes = pd.read_csv("hg19.fa.gz.fai", header = None, sep = "\t")
chr_sizes = chr_sizes.drop([2, 3, 4], axis = 1)
#chr_sizes = chr_sizes[chr_sizes[0].isin(['chr' + str(i) for i in list(range(1,23)) + ['X', 'Y']])]
#sex_chr_sizes = chr_sizes[chr_sizes[0].str.match('|'.join(['chrX','chrY']))]
#chr_sizes = chr_sizes.drop(chr_sizes[chr_sizes[0] == 'chrX'].index, axis = 0)
#chr_sizes = chr_sizes.drop(chr_sizes[chr_sizes[0] == 'chrY'].index, axis = 0)
#chr_sizes['temp'] = chr_sizes[0].str.split('chr').str[1]
#chr_sizes['temp'] = chr_sizes['temp'].astype(int)
#chr_sizes = chr_sizes.sort_values('temp')
#chr_sizes = chr_sizes.drop('temp', axis = 1)
#chr_sizes = chr_sizes.append(sex_chr_sizes)
chr_sizes.head()
Out[10]:
0 1
0 chr1 249250621
1 chr2 243199373
2 chr3 198022430
3 chr4 191154276
4 chr5 180915260

Now, for each gene in the gene_coords DataFrame, we are going to randomly draw a region of the same length as the gene on the same chromosome and check whether this region overlaps with any gene (not only with this one) on that chromosome. If it does not, we add the region to the notgene_coords DataFrame. If it does overlap, we repeat the random drawing again and again until we succeed in selecting a truly intergenic region of the same length as the given gene.
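The pairwise check below boils down to the standard interval-overlap condition: two half-open intervals [a1, a2) and [b1, b2) overlap iff a1 < b2 and b1 < a2. A minimal sketch (with toy coordinates):

```python
def overlaps(a_start, a_end, b_start, b_end):
    """True if intervals [a_start, a_end) and [b_start, b_end) share any base."""
    return a_start < b_end and b_start < a_end

# A candidate region is kept only if it overlaps no gene on its chromosome:
genes = [(100, 200), (500, 800)]
print(any(overlaps(150, 300, s, e) for s, e in genes))  # → True: hits (100, 200)
print(any(overlaps(300, 450, s, e) for s, e in genes))  # → False: truly intergenic
```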

In [11]:
import numpy as np
chr_list = []
start_list = []
end_list = []
gene_lengths = list(gene_coords.iloc[:, 2] - gene_coords.iloc[:, 1])
a = 0
for i in range(gene_coords.shape[0]):
    chr_df = gene_coords[gene_coords[0].isin([gene_coords.iloc[i,0]])]
    overlap = True
    while overlap == True:
        reg_start = np.random.randint(1, int(chr_sizes[chr_sizes[0] == gene_coords.iloc[i,0]].iloc[:,1]))
        reg_end = reg_start + gene_lengths[i]
        for j in range(chr_df.shape[0]):
            b1 = chr_df.iloc[j,1]
            b2 = chr_df.iloc[j,2]
            if (reg_start > b1 and reg_start < b2) or (reg_end > b1 and reg_end < b2) or \
            (b1 > reg_start and b1 < reg_end) or (b2 > reg_start and b2 < reg_end):
                overlap = True
                break
            else:
                overlap = False
    chr_list.append(gene_coords.iloc[i,0])
    start_list.append(reg_start)
    end_list.append(reg_end)
    a = a + 1
    if a%10000 == 0:
            print('Finished ' + str(a) + ' genes')
notgene_coords = pd.DataFrame({'0': chr_list, '1': start_list, '2': end_list})
notgene_coords.to_csv("notgene_coords.txt", index = False, header = False, sep = "\t")
notgene_coords.head()
Finished 10000 genes
Finished 20000 genes
Finished 30000 genes
Finished 40000 genes
Out[11]:
0 1 2
0 chr1 122614392 122616927
1 chr1 14589063 14604071
2 chr1 113873155 113873222
3 chr1 131976299 131976366
4 chr1 149163069 149163136

Let us now confirm that the gene and intergenic regions indeed do not overlap. We will use bedtools intersect for this purpose. This command returns the coordinates of intersections between two sets of genomic regions. If it does not return anything, this implies there are no intersections between the gene and intergenic regions, which is exactly what we want.

In [12]:
!bedtools intersect -a gene_coords.txt -b notgene_coords.txt | wc -l
0

We get zero intersections, meaning that we indeed managed to build a set of intergenic regions that do not overlap with the gene regions. Now we are going to use the set of intergenic regions and extract the sequences of the hg19 human reference genome corresponding to the intergenic coordinates. We will again use samtools for this purpose.

In [13]:
import os
import subprocess
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes/')
with open('hg19_notgene_regions.fa', 'a') as fp:
    for i in range(notgene_coords.shape[0]):
        coord = str(str(notgene_coords.iloc[i, 0]) + ':' 
                    + str(notgene_coords.iloc[i, 1]) + '-' + str(notgene_coords.iloc[i, 2]))
        subprocess.run(['samtools', 'faidx', 'hg19.fa.gz', str(coord)], stdout = fp)

Everything looks good at first glance; however, taking a closer look, we can see that we have quite a few sequences with N-nucleotides. This is a kind of missing data, since N denotes nucleotides that could not be identified by the sequencer. A fast way to see the presence of N-nucleotides is to grep for them.

In [14]:
!grep -c N hg19_gene_regions.fa
118075
In [15]:
!grep -c N hg19_notgene_regions.fa
8991350

However, we know that samtools splits sequences across multiple lines, so grep does not actually report the number of N-containing entries but the number of N-containing lines. To be more precise, we can quickly count with Biopython the number of entries in each fasta-file that contain at least one N-nucleotide.

In [16]:
from Bio import SeqIO

i = 0
for record in SeqIO.parse('hg19_gene_regions.fa', 'fasta'):
    upper_record = record.seq.upper()
    if 'N' in upper_record:
        i = i + 1
print('Gene regions file contains ' + str(i) + ' entries with at least one N-nucleotide')

j = 0
for record in SeqIO.parse('hg19_notgene_regions.fa', 'fasta'):
    upper_record = record.seq.upper()
    if 'N' in upper_record:
        j = j + 1
print('Intergenic regions file contains ' + str(j) + ' entries with at least one N-nucleotide')
Gene regions file contains 131 entries with at least one N-nucleotide
Intergenic regions file contains 7123 entries with at least one N-nucleotide

If the fractions of missing data differ between gene and intergenic regions, this might be picked up as a signal by Convolutional Neural Networks (CNNs), which is not what we are interested in. Therefore, for simplicity, we will remove all entries containing N-nucleotides from both the gene and intergenic regions. For example, if a gene region contains at least one N-nucleotide, we drop the region together with the corresponding intergenic region, even if the latter contains no N-nucleotides. This keeps the gene and intergenic fasta-files in one-to-one correspondence.

In [17]:
from Bio import SeqIO

gene_file = 'hg19_gene_regions.fa'
notgene_file = 'hg19_notgene_regions.fa'
a = 0
i = 0
with open('hg19_gene_clean.fa', 'a') as gene_out, open('hg19_notgene_clean.fa', 'a') as notgene_out:
    for gene, notgene in zip(SeqIO.parse(gene_file, 'fasta'), SeqIO.parse(notgene_file, 'fasta')):
        upper_gene = gene.seq.upper()
        upper_notgene = notgene.seq.upper()
        a = a + 1
        if a%10000 == 0:
            print('Finished ' + str(a) + ' entries')
        if 'N' not in str(upper_gene) and 'N' not in str(upper_notgene):
            gene.seq = upper_gene
            SeqIO.write(gene, gene_out, 'fasta')
            notgene.seq = upper_notgene
            SeqIO.write(notgene, notgene_out, 'fasta')
            i = i + 1
        else:
            continue
print('We have processed ' + str(a) + ' entries and written ' + str(i) + ' entries to two fasta-files')
Finished 10000 entries
Finished 20000 entries
Finished 30000 entries
Finished 40000 entries
We have processed 40996 entries and written 33791 entries to two fasta-files

Thus we have removed approximately 7000 entries, but we still have plenty of sequences left to run CNNs for gene vs. intergenic region classification. Now let us quickly check with grep whether we are indeed free of N-containing sequences.

In [18]:
!grep -c N hg19_gene_clean.fa
0
In [19]:
!grep -c N hg19_notgene_clean.fa
0

Looks good! Now it is time to start building the input matrix of sequences to be fed into the CNN. Since I have limited memory on my laptop, I will not read the full sequences into memory but extract the first cut = 500 nucleotides from each sequence. Gene and intergenic sequences shorter than cut = 500 nucleotides will be ignored and not included in the CNN input matrix. We also have to make sure that all 4 nucleotides are present in each sequence, i.e. we will omit e.g. AT or CG repeat sequences; otherwise the per-sequence one-hot encoding would produce fewer than 4 channels.

In [1]:
import os
from Bio import SeqIO

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes/')

gene_file = 'hg19_gene_clean.fa'
notgene_file = 'hg19_notgene_clean.fa'

a = 0
gene_seqs = []
notgene_seqs = []
for gene, notgene in zip(SeqIO.parse(gene_file, 'fasta'), SeqIO.parse(notgene_file, 'fasta')):
    cut = 500
    if len(str(gene.seq)) < cut or len(str(notgene.seq)) < cut:
        continue
    s_gene = str(gene.seq)[0:cut]
    s_notgene = str(notgene.seq)[0:cut]
    if s_gene.count('A')>0 and s_gene.count('C')>0 and s_gene.count('G')>0 and s_gene.count('T')>0 and \
    s_notgene.count('A')>0 and s_notgene.count('C')>0 and s_notgene.count('G')>0 and s_notgene.count('T')>0:
        gene_seqs.append(s_gene)
        notgene_seqs.append(s_notgene)
    a = a + 1
    if a%10000 == 0:
        print('Finished ' + str(a) + ' entries')
Finished 10000 entries
Finished 20000 entries
Finished 30000 entries

Next, we will concatenate the gene and intergenic sequences into a data set. Checking the length of this large list of sequences, we can see that we have a fair number of statistical observations to run Convolutional Neural Networks (CNNs).

In [2]:
sequences = gene_seqs + notgene_seqs
len(sequences)
Out[2]:
62318

Here we prepare a list of sequence labels to be used in the CNN. We denote gene sequences as 1 and intergenic regions as 0. The length of this list of labels is of course equal to the length of the list of sequences above.

In [3]:
import numpy as np
labels = list(np.ones(len(gene_seqs))) + list(np.zeros(len(notgene_seqs)))
len(labels)
Out[3]:
62318

Now we need to one-hot-encode both the sequences and the labels for correct input to the CNN. We will use the scikit-learn classes LabelEncoder and OneHotEncoder. The LabelEncoder encodes a sequence of bases as a sequence of integers: 0, 1, 2 and 3. The OneHotEncoder converts an array of integers into a sparse matrix where each row corresponds to one possible value of each feature, i.e. only 0 and 1 are present in the matrix.

In [4]:
from sklearn.preprocessing import LabelEncoder, OneHotEncoder

import warnings
warnings.filterwarnings('ignore')

integer_encoder = LabelEncoder()  
one_hot_encoder = OneHotEncoder()   
input_features = []

for sequence in sequences:
  integer_encoded = integer_encoder.fit_transform(list(sequence))
  integer_encoded = np.array(integer_encoded).reshape(-1, 1)
  one_hot_encoded = one_hot_encoder.fit_transform(integer_encoded)
  input_features.append(one_hot_encoded.toarray())

np.set_printoptions(threshold = 40)
#print(input_features.shape)
input_features = np.stack(input_features)
print("Example sequence\n-----------------------")
print('DNA Sequence #1:\n',sequences[0][:10],'...',sequences[0][-10:])
print('One hot encoding of Sequence #1:\n',input_features[0].T)
Example sequence
-----------------------
DNA Sequence #1:
 TCCTGCACAG ... GGGTGGTTGG
One hot encoding of Sequence #1:
 [[0. 0. 0. ... 0. 0. 0.]
 [0. 1. 1. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 1. 1.]
 [1. 0. 0. ... 1. 0. 0.]]
In [5]:
one_hot_encoder = OneHotEncoder()
labels = np.array(labels).reshape(-1, 1)
input_labels = one_hot_encoder.fit_transform(labels).toarray()

print('Labels:\n',labels.T)
print('One-hot encoded labels:\n',input_labels.T)
Labels:
 [[1. 1. 1. ... 0. 0. 0.]]
One-hot encoded labels:
 [[0. 0. 0. ... 1. 1. 1.]
 [1. 1. 1. ... 0. 0. 0.]]

Next is the standard step of splitting the data set into training and test subsets; the latter will be used for the final evaluation of the model. Please note that the one-hot-encoded data set of sequences is a three-dimensional array, analogous to how images are encoded for image recognition problems. The first number in the array is the number of statistical observations (sequences), the second is the dimensionality of the data (length of the sequences), equal to the cut variable, and the third number denotes the number of channels, i.e. the 4 possible nucleotides A, C, G and T. Overall the data looks like a 1D image; a proper 2D image would be represented by a four-dimensional array, where the second and third dimensions would correspond to the height and width (in pixels) of the 2D images, and the fourth dimension would be the number of color channels, e.g. R, G and B.

In [6]:
from sklearn.model_selection import train_test_split

train_features, test_features, train_labels, test_labels = train_test_split(
    input_features, input_labels, test_size = 0.2, random_state = 42)
In [7]:
train_features.shape
Out[7]:
(49854, 500, 4)
In [8]:
train_labels.shape
Out[8]:
(49854, 2)
In [9]:
test_features.shape
Out[9]:
(12464, 500, 4)
In [10]:
test_labels.shape
Out[10]:
(12464, 2)

Finally it is time to construct a simple, shallow Convolutional Neural Network (CNN) and start training it. We build quite a shallow one-block VGG-like CNN, i.e. two consecutive 1D convolutional layers followed by a 1D max-pooling layer. Light dropout regularization is used to prevent overfitting.

In [14]:
from keras.optimizers import SGD, Adam, Adadelta
from keras.layers import Conv1D, Dense, MaxPooling1D, Flatten, Dropout, Embedding, Activation
from keras.models import Sequential
from keras.regularizers import l1

import warnings
warnings.filterwarnings('ignore')

model = Sequential()

model.add(Conv1D(filters = 16, kernel_size = 5, padding = 'same', kernel_initializer = 'he_uniform', 
                 input_shape = (train_features.shape[1], 4)))
model.add(Activation("relu"))
model.add(Conv1D(filters = 16, kernel_size = 5, padding = 'same', kernel_initializer = 'he_uniform'))
model.add(Activation("relu"))
model.add(MaxPooling1D(pool_size = 2))
model.add(Dropout(0.3))

model.add(Flatten())
model.add(Dense(8, kernel_initializer = 'he_uniform'))
model.add(Activation("relu"))
model.add(Dropout(0.4))
model.add(Dense(2, activation = 'softmax'))

epochs = 100
lrate = 0.01
decay = lrate / epochs
sgd = SGD(lr = lrate, momentum = 0.9, decay = decay, nesterov = False)
model.compile(loss = 'binary_crossentropy', optimizer = sgd, metrics = ['binary_accuracy'])
#model.compile(loss='binary_crossentropy', optimizer=Adam(lr = lrate), metrics=['binary_accuracy'])
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv1d_5 (Conv1D)            (None, 500, 16)           336       
_________________________________________________________________
activation_7 (Activation)    (None, 500, 16)           0         
_________________________________________________________________
conv1d_6 (Conv1D)            (None, 500, 16)           1296      
_________________________________________________________________
activation_8 (Activation)    (None, 500, 16)           0         
_________________________________________________________________
max_pooling1d_3 (MaxPooling1 (None, 250, 16)           0         
_________________________________________________________________
dropout_5 (Dropout)          (None, 250, 16)           0         
_________________________________________________________________
flatten_3 (Flatten)          (None, 4000)              0         
_________________________________________________________________
dense_5 (Dense)              (None, 8)                 32008     
_________________________________________________________________
activation_9 (Activation)    (None, 8)                 0         
_________________________________________________________________
dropout_6 (Dropout)          (None, 8)                 0         
_________________________________________________________________
dense_6 (Dense)              (None, 2)                 18        
=================================================================
Total params: 33,658
Trainable params: 33,658
Non-trainable params: 0
_________________________________________________________________
In [15]:
import warnings
warnings.filterwarnings('ignore')

history = model.fit(train_features, train_labels, 
                    epochs = epochs, verbose = 1, validation_split = 0.2, batch_size = 32, shuffle = True)
Train on 39883 samples, validate on 9971 samples
Epoch 1/100
39883/39883 [==============================] - 16s 396us/step - loss: 0.5448 - binary_accuracy: 0.7218 - val_loss: 0.4346 - val_binary_accuracy: 0.8115
Epoch 2/100
39883/39883 [==============================] - 17s 427us/step - loss: 0.4601 - binary_accuracy: 0.7944 - val_loss: 0.4137 - val_binary_accuracy: 0.8270
Epoch 3/100
39883/39883 [==============================] - 17s 429us/step - loss: 0.4436 - binary_accuracy: 0.8078 - val_loss: 0.3960 - val_binary_accuracy: 0.8383
Epoch 4/100
39883/39883 [==============================] - 17s 429us/step - loss: 0.4355 - binary_accuracy: 0.8152 - val_loss: 0.3973 - val_binary_accuracy: 0.8366
Epoch 5/100
39883/39883 [==============================] - 17s 427us/step - loss: 0.4220 - binary_accuracy: 0.8216 - val_loss: 0.3915 - val_binary_accuracy: 0.8402
Epoch 6/100
39883/39883 [==============================] - 17s 432us/step - loss: 0.4157 - binary_accuracy: 0.8249 - val_loss: 0.3852 - val_binary_accuracy: 0.8442
Epoch 7/100
39883/39883 [==============================] - 17s 432us/step - loss: 0.4118 - binary_accuracy: 0.8290 - val_loss: 0.3881 - val_binary_accuracy: 0.8379
Epoch 8/100
39883/39883 [==============================] - 18s 452us/step - loss: 0.4091 - binary_accuracy: 0.8302 - val_loss: 0.3788 - val_binary_accuracy: 0.8461
Epoch 9/100
39883/39883 [==============================] - 17s 428us/step - loss: 0.4035 - binary_accuracy: 0.8321 - val_loss: 0.3865 - val_binary_accuracy: 0.8472
Epoch 10/100
39883/39883 [==============================] - 17s 429us/step - loss: 0.3989 - binary_accuracy: 0.8350 - val_loss: 0.3908 - val_binary_accuracy: 0.8348
Epoch 11/100
39883/39883 [==============================] - 17s 431us/step - loss: 0.3992 - binary_accuracy: 0.8358 - val_loss: 0.3790 - val_binary_accuracy: 0.8460
Epoch 12/100
39883/39883 [==============================] - 17s 436us/step - loss: 0.3935 - binary_accuracy: 0.8374 - val_loss: 0.3760 - val_binary_accuracy: 0.8467
Epoch 13/100
39883/39883 [==============================] - 18s 442us/step - loss: 0.3914 - binary_accuracy: 0.8393 - val_loss: 0.3771 - val_binary_accuracy: 0.8450
Epoch 14/100
39883/39883 [==============================] - 18s 439us/step - loss: 0.3911 - binary_accuracy: 0.8397 - val_loss: 0.3899 - val_binary_accuracy: 0.8355
Epoch 15/100
39883/39883 [==============================] - 17s 438us/step - loss: 0.3849 - binary_accuracy: 0.8408 - val_loss: 0.3813 - val_binary_accuracy: 0.8417
Epoch 16/100
39883/39883 [==============================] - 18s 439us/step - loss: 0.3845 - binary_accuracy: 0.8448 - val_loss: 0.3788 - val_binary_accuracy: 0.8442
Epoch 17/100
39883/39883 [==============================] - 18s 450us/step - loss: 0.3823 - binary_accuracy: 0.8454 - val_loss: 0.3723 - val_binary_accuracy: 0.8463
Epoch 18/100
39883/39883 [==============================] - 18s 456us/step - loss: 0.3780 - binary_accuracy: 0.8441 - val_loss: 0.3733 - val_binary_accuracy: 0.8474
Epoch 19/100
39883/39883 [==============================] - 18s 443us/step - loss: 0.3734 - binary_accuracy: 0.8472 - val_loss: 0.3736 - val_binary_accuracy: 0.8459
Epoch 20/100
39883/39883 [==============================] - 18s 440us/step - loss: 0.3691 - binary_accuracy: 0.8481 - val_loss: 0.3685 - val_binary_accuracy: 0.8474
Epoch 21/100
39883/39883 [==============================] - 18s 439us/step - loss: 0.3661 - binary_accuracy: 0.8482 - val_loss: 0.3678 - val_binary_accuracy: 0.8461
Epoch 22/100
39883/39883 [==============================] - 18s 458us/step - loss: 0.3619 - binary_accuracy: 0.8511 - val_loss: 0.3738 - val_binary_accuracy: 0.8403
Epoch 23/100
39883/39883 [==============================] - 18s 453us/step - loss: 0.3602 - binary_accuracy: 0.8515 - val_loss: 0.3725 - val_binary_accuracy: 0.8490
Epoch 24/100
39883/39883 [==============================] - 18s 447us/step - loss: 0.3560 - binary_accuracy: 0.8544 - val_loss: 0.3675 - val_binary_accuracy: 0.8491
Epoch 25/100
39883/39883 [==============================] - 18s 442us/step - loss: 0.3587 - binary_accuracy: 0.8517 - val_loss: 0.3688 - val_binary_accuracy: 0.8463
Epoch 26/100
39883/39883 [==============================] - 18s 461us/step - loss: 0.3550 - binary_accuracy: 0.8523 - val_loss: 0.3679 - val_binary_accuracy: 0.8473
Epoch 27/100
39883/39883 [==============================] - 18s 459us/step - loss: 0.3550 - binary_accuracy: 0.8530 - val_loss: 0.3676 - val_binary_accuracy: 0.8472
Epoch 28/100
39883/39883 [==============================] - 18s 445us/step - loss: 0.3530 - binary_accuracy: 0.8535 - val_loss: 0.3683 - val_binary_accuracy: 0.8501
Epoch 29/100
39883/39883 [==============================] - 18s 457us/step - loss: 0.3518 - binary_accuracy: 0.8541 - val_loss: 0.3717 - val_binary_accuracy: 0.8510
Epoch 30/100
39883/39883 [==============================] - 18s 441us/step - loss: 0.3463 - binary_accuracy: 0.8567 - val_loss: 0.3718 - val_binary_accuracy: 0.8492
Epoch 31/100
39883/39883 [==============================] - 18s 447us/step - loss: 0.3484 - binary_accuracy: 0.8555 - val_loss: 0.3719 - val_binary_accuracy: 0.8420
Epoch 32/100
39883/39883 [==============================] - 18s 442us/step - loss: 0.3488 - binary_accuracy: 0.8553 - val_loss: 0.3696 - val_binary_accuracy: 0.8491
Epoch 33/100
39883/39883 [==============================] - 17s 433us/step - loss: 0.3447 - binary_accuracy: 0.8575 - val_loss: 0.3773 - val_binary_accuracy: 0.8484
Epoch 34/100
39883/39883 [==============================] - 17s 436us/step - loss: 0.3448 - binary_accuracy: 0.8578 - val_loss: 0.3698 - val_binary_accuracy: 0.8477
Epoch 35/100
39883/39883 [==============================] - 18s 448us/step - loss: 0.3449 - binary_accuracy: 0.8575 - val_loss: 0.3717 - val_binary_accuracy: 0.8479
Epoch 36/100
39883/39883 [==============================] - 18s 452us/step - loss: 0.3433 - binary_accuracy: 0.8580 - val_loss: 0.3722 - val_binary_accuracy: 0.8526
Epoch 37/100
39883/39883 [==============================] - 18s 443us/step - loss: 0.3425 - binary_accuracy: 0.8590 - val_loss: 0.3776 - val_binary_accuracy: 0.8517
Epoch 38/100
39883/39883 [==============================] - 18s 445us/step - loss: 0.3432 - binary_accuracy: 0.8579 - val_loss: 0.3785 - val_binary_accuracy: 0.8512
Epoch 39/100
39883/39883 [==============================] - 18s 452us/step - loss: 0.3409 - binary_accuracy: 0.8577 - val_loss: 0.3756 - val_binary_accuracy: 0.8488
Epoch 40/100
39883/39883 [==============================] - 18s 456us/step - loss: 0.3402 - binary_accuracy: 0.8585 - val_loss: 0.3693 - val_binary_accuracy: 0.8500
Epoch 41/100
39883/39883 [==============================] - 18s 450us/step - loss: 0.3380 - binary_accuracy: 0.8591 - val_loss: 0.3796 - val_binary_accuracy: 0.8515
Epoch 42/100
39883/39883 [==============================] - 18s 444us/step - loss: 0.3364 - binary_accuracy: 0.8619 - val_loss: 0.3754 - val_binary_accuracy: 0.8485
Epoch 43/100
39883/39883 [==============================] - 18s 454us/step - loss: 0.3372 - binary_accuracy: 0.8620 - val_loss: 0.3772 - val_binary_accuracy: 0.8488
Epoch 44/100
39883/39883 [==============================] - 19s 471us/step - loss: 0.3349 - binary_accuracy: 0.8605 - val_loss: 0.3757 - val_binary_accuracy: 0.8472
Epoch 45/100
39883/39883 [==============================] - 18s 453us/step - loss: 0.3368 - binary_accuracy: 0.8605 - val_loss: 0.3724 - val_binary_accuracy: 0.8484
Epoch 46/100
39883/39883 [==============================] - 18s 453us/step - loss: 0.3362 - binary_accuracy: 0.8599 - val_loss: 0.3749 - val_binary_accuracy: 0.8435
Epoch 47/100
39883/39883 [==============================] - 19s 472us/step - loss: 0.3361 - binary_accuracy: 0.8603 - val_loss: 0.3715 - val_binary_accuracy: 0.8500
Epoch 48/100
39883/39883 [==============================] - 18s 461us/step - loss: 0.3343 - binary_accuracy: 0.8625 - val_loss: 0.3716 - val_binary_accuracy: 0.8505
Epoch 49/100
39883/39883 [==============================] - 18s 452us/step - loss: 0.3281 - binary_accuracy: 0.8608 - val_loss: 0.3772 - val_binary_accuracy: 0.8519
Epoch 50/100
39883/39883 [==============================] - 17s 428us/step - loss: 0.3283 - binary_accuracy: 0.8639 - val_loss: 0.3823 - val_binary_accuracy: 0.8530
Epoch 51/100
39883/39883 [==============================] - 17s 432us/step - loss: 0.3288 - binary_accuracy: 0.8628 - val_loss: 0.3748 - val_binary_accuracy: 0.8519
Epoch 52/100
39883/39883 [==============================] - 17s 425us/step - loss: 0.3273 - binary_accuracy: 0.8635 - val_loss: 0.3736 - val_binary_accuracy: 0.8474
Epoch 53/100
39883/39883 [==============================] - 17s 422us/step - loss: 0.3258 - binary_accuracy: 0.8625 - val_loss: 0.3826 - val_binary_accuracy: 0.8545
Epoch 54/100
39883/39883 [==============================] - 17s 438us/step - loss: 0.3248 - binary_accuracy: 0.8648 - val_loss: 0.3720 - val_binary_accuracy: 0.8501
Epoch 55/100
39883/39883 [==============================] - 17s 418us/step - loss: 0.3227 - binary_accuracy: 0.8659 - val_loss: 0.3769 - val_binary_accuracy: 0.8525
Epoch 56/100
39883/39883 [==============================] - 17s 423us/step - loss: 0.3256 - binary_accuracy: 0.8637 - val_loss: 0.3774 - val_binary_accuracy: 0.8504
Epoch 57/100
39883/39883 [==============================] - 17s 421us/step - loss: 0.3226 - binary_accuracy: 0.8633 - val_loss: 0.3731 - val_binary_accuracy: 0.8445
Epoch 58/100
39883/39883 [==============================] - 17s 428us/step - loss: 0.3197 - binary_accuracy: 0.8658 - val_loss: 0.3754 - val_binary_accuracy: 0.8500
Epoch 59/100
39883/39883 [==============================] - 17s 429us/step - loss: 0.3209 - binary_accuracy: 0.8636 - val_loss: 0.3744 - val_binary_accuracy: 0.8469
Epoch 60/100
39883/39883 [==============================] - 17s 438us/step - loss: 0.3190 - binary_accuracy: 0.8649 - val_loss: 0.3782 - val_binary_accuracy: 0.8494
Epoch 61/100
39883/39883 [==============================] - 17s 438us/step - loss: 0.3207 - binary_accuracy: 0.8651 - val_loss: 0.3769 - val_binary_accuracy: 0.8476
Epoch 62/100
39883/39883 [==============================] - 17s 429us/step - loss: 0.3151 - binary_accuracy: 0.8675 - val_loss: 0.3799 - val_binary_accuracy: 0.8508
Epoch 63/100
39883/39883 [==============================] - 17s 431us/step - loss: 0.3144 - binary_accuracy: 0.8669 - val_loss: 0.3800 - val_binary_accuracy: 0.8476
Epoch 64/100
39883/39883 [==============================] - 17s 430us/step - loss: 0.3135 - binary_accuracy: 0.8671 - val_loss: 0.3799 - val_binary_accuracy: 0.8459
Epoch 65/100
39883/39883 [==============================] - 18s 441us/step - loss: 0.3156 - binary_accuracy: 0.8677 - val_loss: 0.3800 - val_binary_accuracy: 0.8485
Epoch 66/100
39883/39883 [==============================] - 18s 445us/step - loss: 0.3142 - binary_accuracy: 0.8670 - val_loss: 0.3786 - val_binary_accuracy: 0.8459
Epoch 67/100
39883/39883 [==============================] - 18s 446us/step - loss: 0.3163 - binary_accuracy: 0.8662 - val_loss: 0.3818 - val_binary_accuracy: 0.8471
Epoch 68/100
39883/39883 [==============================] - 17s 435us/step - loss: 0.3152 - binary_accuracy: 0.8682 - val_loss: 0.3815 - val_binary_accuracy: 0.8505
Epoch 69/100
39883/39883 [==============================] - 17s 434us/step - loss: 0.3146 - binary_accuracy: 0.8664 - val_loss: 0.3794 - val_binary_accuracy: 0.8491
Epoch 70/100
39883/39883 [==============================] - 17s 429us/step - loss: 0.3129 - binary_accuracy: 0.8665 - val_loss: 0.3799 - val_binary_accuracy: 0.8464
Epoch 71/100
39883/39883 [==============================] - 18s 440us/step - loss: 0.3101 - binary_accuracy: 0.8673 - val_loss: 0.3800 - val_binary_accuracy: 0.8476
Epoch 72/100
39883/39883 [==============================] - 18s 446us/step - loss: 0.3153 - binary_accuracy: 0.8661 - val_loss: 0.3817 - val_binary_accuracy: 0.8490
Epoch 73/100
39883/39883 [==============================] - 17s 437us/step - loss: 0.3116 - binary_accuracy: 0.8700 - val_loss: 0.3832 - val_binary_accuracy: 0.8499
Epoch 74/100
39883/39883 [==============================] - 17s 433us/step - loss: 0.3113 - binary_accuracy: 0.8665 - val_loss: 0.3815 - val_binary_accuracy: 0.8492
Epoch 75/100
39883/39883 [==============================] - 17s 433us/step - loss: 0.3095 - binary_accuracy: 0.8698 - val_loss: 0.3880 - val_binary_accuracy: 0.8442
Epoch 76/100
39883/39883 [==============================] - 18s 441us/step - loss: 0.3072 - binary_accuracy: 0.8689 - val_loss: 0.3856 - val_binary_accuracy: 0.8496
Epoch 77/100
39883/39883 [==============================] - 18s 447us/step - loss: 0.3049 - binary_accuracy: 0.8709 - val_loss: 0.3825 - val_binary_accuracy: 0.8495
Epoch 78/100
39883/39883 [==============================] - 18s 450us/step - loss: 0.3118 - binary_accuracy: 0.8664 - val_loss: 0.3799 - val_binary_accuracy: 0.8503
Epoch 79/100
39883/39883 [==============================] - 18s 446us/step - loss: 0.3108 - binary_accuracy: 0.8673 - val_loss: 0.3825 - val_binary_accuracy: 0.8478
Epoch 80/100
39883/39883 [==============================] - 18s 442us/step - loss: 0.3117 - binary_accuracy: 0.8685 - val_loss: 0.3819 - val_binary_accuracy: 0.8483
Epoch 81/100
39883/39883 [==============================] - 18s 454us/step - loss: 0.3122 - binary_accuracy: 0.8680 - val_loss: 0.3842 - val_binary_accuracy: 0.8499
Epoch 82/100
39883/39883 [==============================] - 18s 456us/step - loss: 0.3113 - binary_accuracy: 0.8673 - val_loss: 0.3829 - val_binary_accuracy: 0.8480
Epoch 83/100
39883/39883 [==============================] - 18s 449us/step - loss: 0.3090 - binary_accuracy: 0.8698 - val_loss: 0.3816 - val_binary_accuracy: 0.8501
Epoch 84/100
39883/39883 [==============================] - 18s 449us/step - loss: 0.3107 - binary_accuracy: 0.8679 - val_loss: 0.3837 - val_binary_accuracy: 0.8499
Epoch 85/100
39883/39883 [==============================] - 18s 464us/step - loss: 0.3058 - binary_accuracy: 0.8703 - val_loss: 0.3854 - val_binary_accuracy: 0.8509
Epoch 86/100
39883/39883 [==============================] - 19s 467us/step - loss: 0.3083 - binary_accuracy: 0.8693 - val_loss: 0.3836 - val_binary_accuracy: 0.8475
Epoch 87/100
39883/39883 [==============================] - 18s 443us/step - loss: 0.3081 - binary_accuracy: 0.8705 - val_loss: 0.3837 - val_binary_accuracy: 0.8514
Epoch 88/100
39883/39883 [==============================] - 17s 431us/step - loss: 0.3061 - binary_accuracy: 0.8705 - val_loss: 0.3855 - val_binary_accuracy: 0.8503
Epoch 89/100
39883/39883 [==============================] - 17s 438us/step - loss: 0.3033 - binary_accuracy: 0.8718 - val_loss: 0.3860 - val_binary_accuracy: 0.8473
Epoch 90/100
39883/39883 [==============================] - 18s 439us/step - loss: 0.3082 - binary_accuracy: 0.8690 - val_loss: 0.3841 - val_binary_accuracy: 0.8481
Epoch 91/100
39883/39883 [==============================] - 18s 442us/step - loss: 0.3058 - binary_accuracy: 0.8698 - val_loss: 0.3857 - val_binary_accuracy: 0.8505
Epoch 92/100
39883/39883 [==============================] - 17s 436us/step - loss: 0.3022 - binary_accuracy: 0.8703 - val_loss: 0.3871 - val_binary_accuracy: 0.8515
Epoch 93/100
39883/39883 [==============================] - 17s 435us/step - loss: 0.3048 - binary_accuracy: 0.8711 - val_loss: 0.3837 - val_binary_accuracy: 0.8470
Epoch 94/100
39883/39883 [==============================] - 18s 440us/step - loss: 0.3043 - binary_accuracy: 0.8709 - val_loss: 0.3870 - val_binary_accuracy: 0.8501
Epoch 95/100
39883/39883 [==============================] - 18s 451us/step - loss: 0.3049 - binary_accuracy: 0.8696 - val_loss: 0.3823 - val_binary_accuracy: 0.8505
Epoch 96/100
39883/39883 [==============================] - 18s 447us/step - loss: 0.3013 - binary_accuracy: 0.8710 - val_loss: 0.3840 - val_binary_accuracy: 0.8494
Epoch 97/100
39883/39883 [==============================] - 18s 441us/step - loss: 0.3081 - binary_accuracy: 0.8703 - val_loss: 0.3829 - val_binary_accuracy: 0.8520
Epoch 98/100
39883/39883 [==============================] - 18s 441us/step - loss: 0.3043 - binary_accuracy: 0.8707 - val_loss: 0.3838 - val_binary_accuracy: 0.8525
Epoch 99/100
39883/39883 [==============================] - 18s 456us/step - loss: 0.3081 - binary_accuracy: 0.8679 - val_loss: 0.3819 - val_binary_accuracy: 0.8490
Epoch 100/100
39883/39883 [==============================] - 17s 422us/step - loss: 0.3025 - binary_accuracy: 0.8699 - val_loss: 0.3872 - val_binary_accuracy: 0.8522
In [17]:
import matplotlib.pyplot as plt

plt.figure(figsize=(20,15))
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('Model Loss', fontsize = 20)
plt.ylabel('Loss', fontsize = 20)
plt.xlabel('Epoch', fontsize = 20)
plt.legend(['Train', 'Validation'], fontsize = 20)
plt.show()

plt.figure(figsize=(20,15))
plt.plot(history.history['binary_accuracy'])
plt.plot(history.history['val_binary_accuracy'])
plt.title('Model Accuracy', fontsize = 20)
plt.ylabel('Accuracy', fontsize = 20)
plt.xlabel('Epoch', fontsize = 20)
plt.legend(['Train', 'Validation'], fontsize = 20)
plt.show()

The training curves behave fairly normally, and we reach a classification accuracy of about 85%. Slight overfitting appears after approximately 20 epochs of training, but nothing dramatic: the training and validation curves remain in close proximity to each other.
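Since the validation loss stops improving after roughly 20 epochs, one common remedy is early stopping. Keras provides an EarlyStopping callback for this; its core patience logic can be sketched in plain Python (the loss values and patience below are illustrative, not taken from this training run):

```python
def early_stop_epoch(val_losses, patience=5, min_delta=0.0):
    """Return the epoch index at which training would stop, or None.

    Training stops once the validation loss has not improved by more
    than `min_delta` for `patience` consecutive epochs.
    """
    best = float('inf')
    wait = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best - min_delta:
            best = loss   # new best validation loss: reset the counter
            wait = 0
        else:
            wait += 1     # no improvement this epoch
            if wait >= patience:
                return epoch
    return None

# A toy loss curve that plateaus after epoch 3:
print(early_stop_epoch([0.50, 0.40, 0.37, 0.36, 0.37, 0.38, 0.39, 0.40], patience=3))  # -> 6
```

In Keras this corresponds to `EarlyStopping(monitor='val_loss', patience=...)` added to the `callbacks` list of `model.fit`.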

In [18]:
from sklearn.metrics import confusion_matrix
import itertools

plt.figure(figsize=(15,10))

predicted_labels = model.predict(np.stack(test_features))
cm = confusion_matrix(np.argmax(test_labels, axis=1), 
                      np.argmax(predicted_labels, axis=1))
print('Confusion matrix:\n',cm)

cm = cm.astype('float') / cm.sum(axis = 1)[:, np.newaxis]

plt.imshow(cm, cmap = plt.cm.Blues)
plt.title('Normalized confusion matrix', fontsize = 20)
plt.colorbar()
plt.xlabel('Predicted label', fontsize = 20)
plt.ylabel('True label', fontsize = 20)
for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):
    plt.text(j, i, format(cm[i, j], '.2f'),
             horizontalalignment = 'center', verticalalignment = 'center', fontsize = 20,
             color='white' if cm[i, j] > 0.5 else 'black')
plt.show()
Confusion matrix:
 [[5672  602]
 [1217 4973]]
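As a sanity check, the headline metrics can be recomputed by hand from the confusion matrix printed above (in scikit-learn's convention, rows are true classes and columns are predicted classes):

```python
import numpy as np

# Confusion matrix printed above: rows = true class, columns = predicted class
cm = np.array([[5672,  602],
               [1217, 4973]])

tn, fp = cm[0]
fn, tp = cm[1]

accuracy  = (tp + tn) / cm.sum()
precision = tp / (tp + fp)   # of predicted positives, how many are correct
recall    = tp / (tp + fn)   # of true positives, how many were found

print(f'accuracy  = {accuracy:.4f}')   # ~0.8541
print(f'precision = {precision:.4f}')
print(f'recall    = {recall:.4f}')
```

The accuracy of ~0.8541 matches the value reported by `model.evaluate`, and the precision/recall split shows the errors are not symmetric between the two classes.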
In [19]:
scores = model.evaluate(test_features, test_labels, verbose=0)
print("Accuracy: %.2f%%" % (scores[1]*100))
Accuracy: 85.41%

The confusion matrix and the final evaluation on the test data set demonstrate a good sequence classification accuracy of about 85%. At some point it would be interesting to look at the misclassified sequences in order to understand what is special about them and why the CNN failed to classify them correctly. For each sequence we can now construct a so-called saliency map, i.e. the importance of each nucleotide for the classification of the sequence. In this way we can see whether the CNN learnt certain patterns to be most important for gene vs. intergenic sequence classification.

In [20]:
import keras.backend as K

def compute_salient_bases(model, x):
    # Gradient of the positive-class output with respect to the input sequence
    input_tensors = [model.input]
    gradients = model.optimizer.get_gradients(model.output[0][1], model.input)
    compute_gradients = K.function(inputs = input_tensors, outputs = gradients)
    
    # Saliency = positive part of (gradient * input), summed over the one-hot channels
    x_value = np.expand_dims(x, axis=0)
    gradients = compute_gradients([x_value])[0][0]
    sal = np.clip(np.sum(np.multiply(gradients, x), axis=1), a_min=0, a_max=None)
    return sal
In [21]:
sequence_index = 12
K.set_learning_phase(1) #set learning phase
sal = compute_salient_bases(model, input_features[sequence_index])

plt.figure(figsize=[16,5])
zoom = len(sal)
barlist = plt.bar(np.arange(len(sal[0:zoom])), sal[0:zoom])
plt.xlabel('Bases')
plt.ylabel('Magnitude of saliency values')
plt.xticks(np.arange(len(sal[0:zoom])), list(sequences[sequence_index][0:zoom]), size = 6);
plt.title('Saliency map for bases in one of the sequences');
plt.show()

Neanderthal Introgressed vs. Depleted Sequence Classification

Now we are going to apply a methodology similar to the one used for gene vs. intergenic sequence classification, but this time to distinguish regions of Neanderthal introgression in the modern human genome from regions depleted of Neanderthal ancestry. We are going to train our CNN model on Neanderthal introgressed haplotypes identified with the S*-statistic in Europeans and Asians from the 1000 Genomes project in the study of Vernot and Akey, Science 2016, available here https://drive.google.com/drive/folders/0B9Pc7_zItMCVWUp6bWtXc2xJVkk. We used the coordinates of introgressed haplotypes from the file introgressed_haplotypes/LL.callsetEUR.mr_0.99.neand_calls_by_hap.bed.merged.by_chr.bed and kept only unique coordinates, which left us with 83 601 regions of Neanderthal introgression in modern Europeans. Let us read these coordinates and have a look at the length distribution of the Neanderthal introgressed regions.

In [1]:
import os
import pandas as pd
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')
intr_coords = pd.read_csv('Akey_intr_coords.bed', header = None, sep = "\t")
intr_coords.head()
Out[1]:
0 1 2
0 chr1 2903159 2915884
1 chr1 2932446 2972497
2 chr1 2960608 2996556
3 chr1 2960608 2999518
4 chr1 2960608 3001253
In [2]:
intr_coords.shape
Out[2]:
(83601, 3)
In [4]:
import seaborn as sns
import matplotlib.pyplot as plt
plt.figure(figsize=(20,15))
intr_lengths = intr_coords.iloc[:, 2]-intr_coords.iloc[:, 1]
sns.distplot(intr_lengths)
plt.title("Distribution of Lengths of Neandertal Introgressed Regions", fontsize = 20)
plt.xlabel("Lengths of Neandertal Introgressed Regions", fontsize = 20)
plt.ylabel("Frequency", fontsize = 20)
plt.show()
In [5]:
from scipy import stats
print(stats.describe(intr_lengths))
DescribeResult(nobs=83601, minmax=(10002, 1194940), mean=92137.5803877944, variance=6139175379.516866, skewness=2.7095069302166377, kurtosis=11.38558261567325)
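Note that scipy's describe reports the mean but not the median, and with a skewness of ~2.7 the distribution has a long right tail, so the median is the more informative location statistic. It is easily obtained with numpy (illustrated here on a toy stand-in array; in the notebook you would pass the real intr_lengths Series):

```python
import numpy as np

# Toy stand-in for intr_lengths, mimicking a right-skewed length distribution
lengths = np.array([10_002, 45_000, 92_000, 100_500, 310_000, 1_194_940])

print('mean   =', np.mean(lengths))    # pulled up by the long right tail
print('median =', np.median(lengths))  # robust to the skew
```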

We can see that the regions of Neanderthal introgression are much longer than typical genes. The shortest introgressed haplotype is 10 kb long, and the median length is around 100 kb. Let us now, as before, use samtools to extract the sequences of the Neanderthal introgressed regions from the hg19 build of the human reference genome.

In [6]:
import os
import subprocess
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')
a = 0
with open('hg19_intr_regions.fa', 'a') as fp:
    for i in range(intr_coords.shape[0]):
        coord = str(str(intr_coords.iloc[i, 0]) + ':' 
                    + str(intr_coords.iloc[i, 1]) + '-' + str(intr_coords.iloc[i, 2]))
        subprocess.run(['samtools', 'faidx', 'hg19.fa.gz', str(coord)], stdout = fp)
        a = a + 1
        if a%10000 == 0:
            print('Finished ' + str(a) + ' Neanderthal introgressed haplotypes')
Finished 10000 Neanderthal introgressed haplotypes
Finished 20000 Neanderthal introgressed haplotypes
Finished 30000 Neanderthal introgressed haplotypes
Finished 40000 Neanderthal introgressed haplotypes
Finished 50000 Neanderthal introgressed haplotypes
Finished 60000 Neanderthal introgressed haplotypes
Finished 70000 Neanderthal introgressed haplotypes
Finished 80000 Neanderthal introgressed haplotypes

Next we will need to build a fasta-file with sequences of the same lengths located outside the Neanderthal introgressed regions. For this purpose we need the lengths of the human chromosomes, which we can easily extract from the fai-file generated when indexing the hg19 human reference genome.

In [6]:
chr_sizes = pd.read_csv("hg19.fa.gz.fai", header = None, sep = "\t")
chr_sizes = chr_sizes.drop([2, 3, 4], axis = 1)
chr_sizes.head()
Out[6]:
0 1
0 chr1 249250621
1 chr2 243199373
2 chr3 198022430
3 chr4 191154276
4 chr5 180915260

Now, for each introgressed region in the intr_coords DataFrame, we are going to randomly draw a region of the same length on the same chromosome and check whether this region overlaps with any introgressed region (not only the current one) on that chromosome. If it does not, we add the region to the depl_coords DataFrame. If it does overlap, we repeat the random draw again and again until we succeed in selecting a region of the same length that is truly free of Neanderthal introgression calls.
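The four-way overlap condition used in the loop below can be reduced to a single standard interval test: two intervals overlap exactly when each one starts before the other ends. A minimal sketch (the helper name is mine, not from the notebook); note that this compact form also catches the edge case of two identical intervals, which the strict four-way inequality check misses:

```python
def overlaps(a_start, a_end, b_start, b_end):
    # Two intervals overlap iff each starts before the other ends.
    # Touching endpoints (a_end == b_start) do not count as overlap,
    # matching the strict inequalities used in the loop below.
    return a_start < b_end and b_start < a_end

print(overlaps(100, 200, 150, 250))  # True:  partial overlap
print(overlaps(100, 200, 200, 300))  # False: merely touching
print(overlaps(100, 300, 150, 200))  # True:  containment
print(overlaps(100, 200, 100, 200))  # True:  identical intervals
```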

In [10]:
import numpy as np
chr_list = []
start_list = []
end_list = []
intr_lengths = list(intr_coords.iloc[:, 2] - intr_coords.iloc[:, 1])
a = 0
for i in range(intr_coords.shape[0]):
    chr_df = intr_coords[intr_coords[0].isin([intr_coords.iloc[i,0]])]
    overlap = True
    while overlap == True:
        reg_start = np.random.randint(1, int(chr_sizes[chr_sizes[0] == intr_coords.iloc[i,0]].iloc[:,1]))
        reg_end = reg_start + intr_lengths[i]
        for j in range(chr_df.shape[0]):
            b1 = chr_df.iloc[j,1]
            b2 = chr_df.iloc[j,2]
            if (reg_start > b1 and reg_start < b2) or (reg_end > b1 and reg_end < b2) or \
            (b1 > reg_start and b1 < reg_end) or (b2 > reg_start and b2 < reg_end):
                overlap = True
                break
            else:
                overlap = False
    chr_list.append(intr_coords.iloc[i,0])
    start_list.append(reg_start)
    end_list.append(reg_end)
    a = a + 1
    if a%10000 == 0:
            print('Finished ' + str(a) + ' Neanderthal introgressed haplotypes')
depl_coords = pd.DataFrame({'0': chr_list, '1': start_list, '2': end_list})
depl_coords.to_csv("Akey_depl_coords.bed", index = False, header = False, sep = "\t")
depl_coords.head()
Finished 10000 Neanderthal introgressed haplotypes
Finished 20000 Neanderthal introgressed haplotypes
Finished 30000 Neanderthal introgressed haplotypes
Finished 40000 Neanderthal introgressed haplotypes
Finished 50000 Neanderthal introgressed haplotypes
Finished 60000 Neanderthal introgressed haplotypes
Finished 70000 Neanderthal introgressed haplotypes
Finished 80000 Neanderthal introgressed haplotypes
Out[10]:
0 1 2
0 chr1 88370430 88383155
1 chr1 248465646 248505697
2 chr1 153182329 153218277
3 chr1 132975937 133014847
4 chr1 175694680 175735325

Let us make sure that we indeed managed to construct a set of regions of the same sizes as the Neanderthal introgressed regions but not overlapping with the latter. For this purpose, we will feed the introgression and depletion coordinates to bedtools intersect; the tool should not find any overlapping intervals.

In [11]:
!bedtools intersect -a Akey_intr_coords.bed -b Akey_depl_coords.bed | wc -l
0

Excellent, the two sets of coordinates do not overlap. We can now use the depletion coordinates to extract the actual sequences from the human reference genome hg19 with samtools.

In [12]:
import os
import subprocess
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')
a = 0
with open('hg19_depl_regions.fa', 'a') as fp:
    for i in range(depl_coords.shape[0]):
        coord = str(str(depl_coords.iloc[i, 0]) + ':' 
                    + str(depl_coords.iloc[i, 1]) + '-' + str(depl_coords.iloc[i, 2]))
        subprocess.run(['samtools', 'faidx', 'hg19.fa.gz', str(coord)], stdout = fp)
        a = a + 1
        if a%10000 == 0:
            print('Finished ' + str(a) + ' Neanderthal ancestry depleted regions')
Finished 10000 Neanderthal ancestry depleted regions
Finished 20000 Neanderthal ancestry depleted regions
Finished 30000 Neanderthal ancestry depleted regions
Finished 40000 Neanderthal ancestry depleted regions
Finished 50000 Neanderthal ancestry depleted regions
Finished 60000 Neanderthal ancestry depleted regions
Finished 70000 Neanderthal ancestry depleted regions
Finished 80000 Neanderthal ancestry depleted regions

Everything looks good at first glance; however, taking a closer look, we can see that quite a few sequences contain N-nucleotides. This is a form of missing data, since N denotes a nucleotide that could not be identified by the sequencer. A fast way to check for the presence of N-nucleotides is to grep for them.

In [13]:
!grep -c N hg19_intr_regions.fa
863
In [14]:
!grep -c N hg19_depl_regions.fa
14823006

However, we know that samtools wraps sequences over multiple lines, so grep does not actually report the number of N-containing entries but the number of N-containing lines. To be more precise, we can use Biopython to count the entries in each fasta-file that contain at least one N-nucleotide.

In [15]:
from Bio import SeqIO

i = 0
for record in SeqIO.parse('hg19_intr_regions.fa', 'fasta'):
    upper_record = record.seq.upper()
    if 'N' in upper_record:
        i = i + 1
print('Introgressed regions file contains ' + str(i) + ' entries with at least one N-nucleotide')

j = 0
for record in SeqIO.parse('hg19_depl_regions.fa', 'fasta'):
    upper_record = record.seq.upper()
    if 'N' in upper_record:
        j = j + 1
print('Depleted regions file contains ' + str(j) + ' entries with at least one N-nucleotide')
Introgressed regions file contains 54 entries with at least one N-nucleotide
Depleted regions file contains 9817 entries with at least one N-nucleotide

If the fractions of missing data differ between Neanderthal introgressed and depleted regions, the Convolutional Neural Network (CNN) may pick this up as a signal, which is not what we are interested in. Therefore, for simplicity, we will remove all entries containing N-nucleotides from both sets. For example, if a Neanderthal introgressed region contains at least one N-nucleotide, we drop it together with the corresponding Neanderthal depleted region, even if the latter contains no N-nucleotides. This keeps the introgressed and depleted fasta-files in one-to-one correspondence.

In [16]:
from Bio import SeqIO

intr_file = 'hg19_intr_regions.fa'
depl_file = 'hg19_depl_regions.fa'
a = 0
i = 0
with open('hg19_intr_clean.fa', 'a') as intr_out, open('hg19_depl_clean.fa', 'a') as depl_out:
    for intr, depl in zip(SeqIO.parse(intr_file, 'fasta'), SeqIO.parse(depl_file, 'fasta')):
        upper_intr = intr.seq.upper()
        upper_depl = depl.seq.upper()
        a = a + 1
        if a%10000 == 0:
            print('Finished ' + str(a) + ' entries')
        if 'N' not in str(upper_intr) and 'N' not in str(upper_depl):
            intr.seq = upper_intr
            SeqIO.write(intr, intr_out, 'fasta')
            depl.seq = upper_depl
            SeqIO.write(depl, depl_out, 'fasta')
            i = i + 1
        else:
            continue
print('We have processed ' + str(a) + ' entries and written ' + str(i) + ' entries to two fasta-files')
Finished 10000 entries
Finished 20000 entries
Finished 30000 entries
Finished 40000 entries
Finished 50000 entries
Finished 60000 entries
Finished 70000 entries
Finished 80000 entries
We have processed 83601 entries and written 73734 entries to two fasta-files

Thus we have removed approximately 10 000 entries but still have plenty of sequences left for training a CNN to classify Neanderthal introgressed vs. depleted regions. Let us quickly check with grep that we are indeed free of N-containing sequences.

In [17]:
!grep -c N hg19_intr_clean.fa
0
In [18]:
!grep -c N hg19_depl_clean.fa
0

Excellent! Now it is time to build the input matrix of sequences to be fed into the CNN. Since I have limited memory on my laptop, I will not read the full sequences into memory but extract the first cut = 1000 nucleotides from each sequence. Neanderthal introgressed and depleted sequences shorter than cut = 1000 nucleotides will be ignored and not included in the CNN input matrix. We also have to make sure that all 4 nucleotides are present in each sequence, i.e. we will omit e.g. AT or CG repeat sequences.

In [1]:
import os
from Bio import SeqIO

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

intr_file = 'hg19_intr_clean.fa'
depl_file = 'hg19_depl_clean.fa'

a = 0
intr_seqs = []
depl_seqs = []
for intr, depl in zip(SeqIO.parse(intr_file, 'fasta'), SeqIO.parse(depl_file, 'fasta')):
    cut = 1000
    if len(str(intr.seq)) < cut or len(str(depl.seq)) < cut:
        continue
    s_intr = str(intr.seq)[0:cut]
    s_depl = str(depl.seq)[0:cut]
    if s_intr.count('A')>0 and s_intr.count('C')>0 and s_intr.count('G')>0 and s_intr.count('T')>0 and \
    s_depl.count('A')>0 and s_depl.count('C')>0 and s_depl.count('G')>0 and s_depl.count('T')>0:
        intr_seqs.append(s_intr)
        depl_seqs.append(s_depl)
    a = a + 1
    if a%10000 == 0:
        print('Finished ' + str(a) + ' entries')
Finished 10000 entries
Finished 20000 entries
Finished 30000 entries
Finished 40000 entries
Finished 50000 entries
Finished 60000 entries
Finished 70000 entries
In [2]:
sequences = intr_seqs + depl_seqs
len(sequences)
Out[2]:
147468
In [3]:
import numpy as np
labels = list(np.ones(len(intr_seqs))) + list(np.zeros(len(depl_seqs)))
len(labels)
Out[3]:
147468
In [4]:
from sklearn.preprocessing import LabelEncoder, OneHotEncoder

import warnings
warnings.filterwarnings('ignore')

integer_encoder = LabelEncoder()  
one_hot_encoder = OneHotEncoder()   
input_features = []

for sequence in sequences:
  integer_encoded = integer_encoder.fit_transform(list(sequence))
  integer_encoded = np.array(integer_encoded).reshape(-1, 1)
  one_hot_encoded = one_hot_encoder.fit_transform(integer_encoded)
  input_features.append(one_hot_encoded.toarray())

np.set_printoptions(threshold = 40)
input_features = np.stack(input_features)
print("Example sequence\n-----------------------")
print('DNA Sequence #1:\n',sequences[0][:10],'...',sequences[0][-10:])
print('One hot encoding of Sequence #1:\n',input_features[0].T)
Example sequence
-----------------------
DNA Sequence #1:
 AATGACATTA ... GGGATGGTGT
One hot encoding of Sequence #1:
 [[1. 1. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 1. 0.]
 [0. 0. 1. ... 1. 0. 1.]]
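A caveat of fitting LabelEncoder per sequence is that the resulting column order (A, C, G, T) is only consistent across sequences because we required every sequence to contain all four nucleotides. A fixed base-to-column mapping avoids that dependency altogether; a minimal sketch (the function name is mine, not from the notebook):

```python
import numpy as np

BASES = 'ACGT'
BASE_INDEX = {b: i for i, b in enumerate(BASES)}

def one_hot(seq):
    # Returns a (len(seq), 4) matrix; column order is fixed to A, C, G, T
    # regardless of which bases the sequence happens to contain.
    out = np.zeros((len(seq), len(BASES)))
    for i, base in enumerate(seq):
        out[i, BASE_INDEX[base]] = 1.0
    return out

print(one_hot('ACGT'))  # each row flags exactly one base (identity matrix here)
```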
In [5]:
one_hot_encoder = OneHotEncoder()
labels = np.array(labels).reshape(-1, 1)
input_labels = one_hot_encoder.fit_transform(labels).toarray()

print('Labels:\n',labels.T)
print('One-hot encoded labels:\n',input_labels.T)
Labels:
 [[1. 1. 1. ... 0. 0. 0.]]
One-hot encoded labels:
 [[0. 0. 0. ... 1. 1. 1.]
 [1. 1. 1. ... 0. 0. 0.]]
In [6]:
from sklearn.model_selection import train_test_split

train_features, test_features, train_labels, test_labels = train_test_split(
    input_features, input_labels, test_size = 0.2, random_state = 42)
In [7]:
train_features.shape
Out[7]:
(117974, 1000, 4)
In [8]:
train_labels.shape
Out[8]:
(117974, 2)
In [9]:
test_features.shape
Out[9]:
(29494, 1000, 4)
In [10]:
test_labels.shape
Out[10]:
(29494, 2)
In [12]:
from keras.models import Sequential
from keras.regularizers import l2, l1
from keras.callbacks import ModelCheckpoint
from keras.optimizers import SGD, Adam, Adadelta, RMSprop
from keras.layers import Conv1D, Conv2D, Dense, MaxPooling1D, MaxPooling2D, Flatten, Dropout, Activation

import warnings
warnings.filterwarnings('ignore')

model = Sequential()
model.add(Conv1D(filters = 16, kernel_size = 5, padding = 'same', 
                 input_shape = (train_features.shape[1], train_features.shape[2])))
model.add(Activation("relu"))
model.add(MaxPooling1D(pool_size = 2))
#model.add(Dropout(0.1))

model.add(Flatten())
model.add(Dense(1000, kernel_regularizer = l1(0.00001)))
model.add(Activation("sigmoid"))
#model.add(Dropout(0.1))
model.add(Dense(2, activation = 'softmax'))

epochs = 100
lrate = 0.001
decay = lrate / epochs
sgd = SGD(lr = lrate, momentum = 0.9, decay = decay, nesterov = False)
#sgd = SGD(lr = lrate, momentum = 0.9, nesterov = False)
model.compile(loss = 'binary_crossentropy', optimizer = sgd, metrics = ['binary_accuracy'])
#model.compile(loss='binary_crossentropy', optimizer=Adam(lr = lrate), metrics=['binary_accuracy'])
checkpoint = ModelCheckpoint("weights.best.hdf5", monitor = 'val_binary_accuracy', verbose = 1, 
                             save_best_only = True, mode = 'max')
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv1d_2 (Conv1D)            (None, 1000, 16)          336       
_________________________________________________________________
activation_3 (Activation)    (None, 1000, 16)          0         
_________________________________________________________________
max_pooling1d_2 (MaxPooling1 (None, 500, 16)           0         
_________________________________________________________________
flatten_2 (Flatten)          (None, 8000)              0         
_________________________________________________________________
dense_3 (Dense)              (None, 1000)              8001000   
_________________________________________________________________
activation_4 (Activation)    (None, 1000)              0         
_________________________________________________________________
dense_4 (Dense)              (None, 2)                 2002      
=================================================================
Total params: 8,003,338
Trainable params: 8,003,338
Non-trainable params: 0
_________________________________________________________________
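The parameter counts in the summary can be verified by hand: a Conv1D layer has filters × (kernel_size × input_channels) weights plus one bias per filter, and a Dense layer has inputs × outputs weights plus one bias per output:

```python
# Conv1D: 16 filters, kernel size 5, 4 input channels (one-hot nucleotides)
conv_params = 16 * (5 * 4) + 16
# Dense after Flatten: 500 positions x 16 filters = 8000 inputs, 1000 outputs
dense1_params = 8000 * 1000 + 1000
# Output layer: 1000 inputs, 2 classes
dense2_params = 1000 * 2 + 2

print(conv_params, dense1_params, dense2_params)    # 336 8001000 2002
print(conv_params + dense1_params + dense2_params)  # 8003338, as in the summary
```

Almost all of the 8M parameters sit in the first Dense layer, which is why the l1 regularizer is attached there.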
In [14]:
import warnings
warnings.filterwarnings('ignore')

history = model.fit(train_features, train_labels, 
                    epochs = epochs, verbose = 1, validation_split = 0.2, batch_size = 32, shuffle = True, 
                    callbacks = [checkpoint])
Train on 94379 samples, validate on 23595 samples
Epoch 1/100
94379/94379 [==============================] - 307s 3ms/step - loss: 1.7224 - binary_accuracy: 0.5108 - val_loss: 1.7024 - val_binary_accuracy: 0.5140

Epoch 00001: val_binary_accuracy improved from -inf to 0.51396, saving model to weights.best.hdf5
Epoch 2/100
94379/94379 [==============================] - 315s 3ms/step - loss: 1.6905 - binary_accuracy: 0.5365 - val_loss: 1.6725 - val_binary_accuracy: 0.5497

Epoch 00002: val_binary_accuracy improved from 0.51396 to 0.54974, saving model to weights.best.hdf5
Epoch 3/100
94379/94379 [==============================] - 319s 3ms/step - loss: 1.6603 - binary_accuracy: 0.5561 - val_loss: 1.6470 - val_binary_accuracy: 0.5659

Epoch 00003: val_binary_accuracy improved from 0.54974 to 0.56588, saving model to weights.best.hdf5
Epoch 4/100
94379/94379 [==============================] - 322s 3ms/step - loss: 1.6295 - binary_accuracy: 0.5818 - val_loss: 1.6230 - val_binary_accuracy: 0.5736

Epoch 00004: val_binary_accuracy improved from 0.56588 to 0.57364, saving model to weights.best.hdf5
Epoch 5/100
94379/94379 [==============================] - 325s 3ms/step - loss: 1.6031 - binary_accuracy: 0.5909 - val_loss: 1.5997 - val_binary_accuracy: 0.5823

Epoch 00005: val_binary_accuracy improved from 0.57364 to 0.58228, saving model to weights.best.hdf5
Epoch 6/100
94379/94379 [==============================] - 323s 3ms/step - loss: 1.5754 - binary_accuracy: 0.6070 - val_loss: 1.5803 - val_binary_accuracy: 0.5831

Epoch 00006: val_binary_accuracy improved from 0.58228 to 0.58309, saving model to weights.best.hdf5
Epoch 7/100
94379/94379 [==============================] - 325s 3ms/step - loss: 1.5498 - binary_accuracy: 0.6168 - val_loss: 1.5602 - val_binary_accuracy: 0.5892

Epoch 00007: val_binary_accuracy improved from 0.58309 to 0.58915, saving model to weights.best.hdf5
Epoch 8/100
94379/94379 [==============================] - 328s 3ms/step - loss: 1.5274 - binary_accuracy: 0.6233 - val_loss: 1.5454 - val_binary_accuracy: 0.5887

Epoch 00008: val_binary_accuracy did not improve from 0.58915
Epoch 9/100
94379/94379 [==============================] - 330s 3ms/step - loss: 1.5054 - binary_accuracy: 0.6304 - val_loss: 1.5230 - val_binary_accuracy: 0.5964

Epoch 00009: val_binary_accuracy improved from 0.58915 to 0.59644, saving model to weights.best.hdf5
Epoch 10/100
94379/94379 [==============================] - 330s 3ms/step - loss: 1.4835 - binary_accuracy: 0.6367 - val_loss: 1.5061 - val_binary_accuracy: 0.5983

Epoch 00010: val_binary_accuracy improved from 0.59644 to 0.59830, saving model to weights.best.hdf5
Epoch 11/100
94379/94379 [==============================] - 331s 4ms/step - loss: 1.4633 - binary_accuracy: 0.6398 - val_loss: 1.4934 - val_binary_accuracy: 0.6006

Epoch 00011: val_binary_accuracy improved from 0.59830 to 0.60059, saving model to weights.best.hdf5
Epoch 12/100
94379/94379 [==============================] - 338s 4ms/step - loss: 1.4434 - binary_accuracy: 0.6482 - val_loss: 1.4740 - val_binary_accuracy: 0.6062

Epoch 00012: val_binary_accuracy improved from 0.60059 to 0.60623, saving model to weights.best.hdf5
Epoch 13/100
94379/94379 [==============================] - 329s 3ms/step - loss: 1.4264 - binary_accuracy: 0.6494 - val_loss: 1.4595 - val_binary_accuracy: 0.6048

Epoch 00013: val_binary_accuracy did not improve from 0.60623
Epoch 14/100
94379/94379 [==============================] - 331s 4ms/step - loss: 1.4087 - binary_accuracy: 0.6526 - val_loss: 1.4522 - val_binary_accuracy: 0.6030

Epoch 00014: val_binary_accuracy did not improve from 0.60623
Epoch 15/100
94379/94379 [==============================] - 331s 4ms/step - loss: 1.3914 - binary_accuracy: 0.6565 - val_loss: 1.4331 - val_binary_accuracy: 0.6088

Epoch 00015: val_binary_accuracy improved from 0.60623 to 0.60877, saving model to weights.best.hdf5
Epoch 16/100
94379/94379 [==============================] - 332s 4ms/step - loss: 1.3763 - binary_accuracy: 0.6578 - val_loss: 1.4313 - val_binary_accuracy: 0.5997

Epoch 00016: val_binary_accuracy did not improve from 0.60877
Epoch 17/100
94379/94379 [==============================] - 332s 4ms/step - loss: 1.3610 - binary_accuracy: 0.6589 - val_loss: 1.4258 - val_binary_accuracy: 0.5936

Epoch 00017: val_binary_accuracy did not improve from 0.60877
Epoch 18/100
94379/94379 [==============================] - 332s 4ms/step - loss: 1.3458 - binary_accuracy: 0.6618 - val_loss: 1.3932 - val_binary_accuracy: 0.6106

Epoch 00018: val_binary_accuracy improved from 0.60877 to 0.61064, saving model to weights.best.hdf5
Epoch 19/100
94379/94379 [==============================] - 329s 3ms/step - loss: 1.3313 - binary_accuracy: 0.6669 - val_loss: 1.3832 - val_binary_accuracy: 0.6086

Epoch 00019: val_binary_accuracy did not improve from 0.61064
Epoch 20/100
94379/94379 [==============================] - 334s 4ms/step - loss: 1.3181 - binary_accuracy: 0.6666 - val_loss: 1.3684 - val_binary_accuracy: 0.6135

Epoch 00020: val_binary_accuracy improved from 0.61064 to 0.61348, saving model to weights.best.hdf5
Epoch 21/100
94379/94379 [==============================] - 334s 4ms/step - loss: 1.3044 - binary_accuracy: 0.6685 - val_loss: 1.3574 - val_binary_accuracy: 0.6137

Epoch 00021: val_binary_accuracy improved from 0.61348 to 0.61369, saving model to weights.best.hdf5
Epoch 22/100
94379/94379 [==============================] - 337s 4ms/step - loss: 1.2916 - binary_accuracy: 0.6693 - val_loss: 1.3471 - val_binary_accuracy: 0.6121

Epoch 00022: val_binary_accuracy did not improve from 0.61369
Epoch 23/100
94379/94379 [==============================] - 334s 4ms/step - loss: 1.2793 - binary_accuracy: 0.6718 - val_loss: 1.3357 - val_binary_accuracy: 0.6139

Epoch 00023: val_binary_accuracy improved from 0.61369 to 0.61390, saving model to weights.best.hdf5
Epoch 24/100
94379/94379 [==============================] - 336s 4ms/step - loss: 1.2660 - binary_accuracy: 0.6739 - val_loss: 1.3246 - val_binary_accuracy: 0.6137

Epoch 00024: val_binary_accuracy did not improve from 0.61390
Epoch 25/100
94379/94379 [==============================] - 332s 4ms/step - loss: 1.2539 - binary_accuracy: 0.6739 - val_loss: 1.3334 - val_binary_accuracy: 0.5999

Epoch 00025: val_binary_accuracy did not improve from 0.61390
Epoch 26/100
94379/94379 [==============================] - 333s 4ms/step - loss: 1.2422 - binary_accuracy: 0.6764 - val_loss: 1.3092 - val_binary_accuracy: 0.6120

Epoch 00026: val_binary_accuracy did not improve from 0.61390
Epoch 27/100
94379/94379 [==============================] - 333s 4ms/step - loss: 1.2308 - binary_accuracy: 0.6777 - val_loss: 1.3196 - val_binary_accuracy: 0.6003

Epoch 00027: val_binary_accuracy did not improve from 0.61390
Epoch 28/100
94379/94379 [==============================] - 335s 4ms/step - loss: 1.2202 - binary_accuracy: 0.6781 - val_loss: 1.2860 - val_binary_accuracy: 0.6136

Epoch 00028: val_binary_accuracy did not improve from 0.61390
Epoch 29/100
94379/94379 [==============================] - 338s 4ms/step - loss: 1.2090 - binary_accuracy: 0.6805 - val_loss: 1.2781 - val_binary_accuracy: 0.6148

Epoch 00029: val_binary_accuracy improved from 0.61390 to 0.61479, saving model to weights.best.hdf5
Epoch 30/100
94379/94379 [==============================] - 337s 4ms/step - loss: 1.1993 - binary_accuracy: 0.6807 - val_loss: 1.2657 - val_binary_accuracy: 0.6148

Epoch 00030: val_binary_accuracy did not improve from 0.61479
Epoch 31/100
94379/94379 [==============================] - 347s 4ms/step - loss: 1.1894 - binary_accuracy: 0.6800 - val_loss: 1.2585 - val_binary_accuracy: 0.6148

Epoch 00031: val_binary_accuracy improved from 0.61479 to 0.61483, saving model to weights.best.hdf5
Epoch 32/100
94379/94379 [==============================] - 334s 4ms/step - loss: 1.1783 - binary_accuracy: 0.6836 - val_loss: 1.2529 - val_binary_accuracy: 0.6146

Epoch 00032: val_binary_accuracy did not improve from 0.61483
Epoch 33/100
94379/94379 [==============================] - 328s 3ms/step - loss: 1.1690 - binary_accuracy: 0.6842 - val_loss: 1.2547 - val_binary_accuracy: 0.6070

Epoch 00033: val_binary_accuracy did not improve from 0.61483
Epoch 34/100
94379/94379 [==============================] - 319s 3ms/step - loss: 1.1593 - binary_accuracy: 0.6850 - val_loss: 1.2315 - val_binary_accuracy: 0.6162

Epoch 00034: val_binary_accuracy improved from 0.61483 to 0.61623, saving model to weights.best.hdf5
Epoch 35/100
94379/94379 [==============================] - 321s 3ms/step - loss: 1.1500 - binary_accuracy: 0.6861 - val_loss: 1.2229 - val_binary_accuracy: 0.6161

Epoch 00035: val_binary_accuracy did not improve from 0.61623
Epoch 36/100
94379/94379 [==============================] - 323s 3ms/step - loss: 1.1413 - binary_accuracy: 0.6874 - val_loss: 1.2313 - val_binary_accuracy: 0.6093

Epoch 00036: val_binary_accuracy did not improve from 0.61623
Epoch 37/100
94379/94379 [==============================] - 325s 3ms/step - loss: 1.1322 - binary_accuracy: 0.6875 - val_loss: 1.2080 - val_binary_accuracy: 0.6176

Epoch 00037: val_binary_accuracy improved from 0.61623 to 0.61759, saving model to weights.best.hdf5
Epoch 38/100
94379/94379 [==============================] - 328s 3ms/step - loss: 1.1227 - binary_accuracy: 0.6885 - val_loss: 1.1989 - val_binary_accuracy: 0.6171

Epoch 00038: val_binary_accuracy did not improve from 0.61759
Epoch 39/100
94379/94379 [==============================] - 330s 3ms/step - loss: 1.1137 - binary_accuracy: 0.6907 - val_loss: 1.2202 - val_binary_accuracy: 0.6039

Epoch 00039: val_binary_accuracy did not improve from 0.61759
Epoch 40/100
94379/94379 [==============================] - 328s 3ms/step - loss: 1.1058 - binary_accuracy: 0.6911 - val_loss: 1.1858 - val_binary_accuracy: 0.6190

Epoch 00040: val_binary_accuracy improved from 0.61759 to 0.61903, saving model to weights.best.hdf5
Epoch 41/100
94379/94379 [==============================] - 333s 4ms/step - loss: 1.0969 - binary_accuracy: 0.6926 - val_loss: 1.1801 - val_binary_accuracy: 0.6177

Epoch 00041: val_binary_accuracy did not improve from 0.61903
Epoch 42/100
94379/94379 [==============================] - 332s 4ms/step - loss: 1.0876 - binary_accuracy: 0.6968 - val_loss: 1.1945 - val_binary_accuracy: 0.6075

Epoch 00042: val_binary_accuracy did not improve from 0.61903
Epoch 43/100
94379/94379 [==============================] - 334s 4ms/step - loss: 1.0803 - binary_accuracy: 0.6944 - val_loss: 1.1641 - val_binary_accuracy: 0.6185

Epoch 00043: val_binary_accuracy did not improve from 0.61903
Epoch 44/100
94379/94379 [==============================] - 334s 4ms/step - loss: 1.0724 - binary_accuracy: 0.6975 - val_loss: 1.1595 - val_binary_accuracy: 0.6178

Epoch 00044: val_binary_accuracy did not improve from 0.61903
Epoch 45/100
94379/94379 [==============================] - 335s 4ms/step - loss: 1.0639 - binary_accuracy: 0.6981 - val_loss: 1.1516 - val_binary_accuracy: 0.6220

Epoch 00045: val_binary_accuracy improved from 0.61903 to 0.62204, saving model to weights.best.hdf5
Epoch 46/100
94379/94379 [==============================] - 323s 3ms/step - loss: 1.0558 - binary_accuracy: 0.7018 - val_loss: 1.1520 - val_binary_accuracy: 0.6174

Epoch 00046: val_binary_accuracy did not improve from 0.62204
Epoch 47/100
94379/94379 [==============================] - 318s 3ms/step - loss: 1.0477 - binary_accuracy: 0.7028 - val_loss: 1.1391 - val_binary_accuracy: 0.6223

Epoch 00047: val_binary_accuracy improved from 0.62204 to 0.62225, saving model to weights.best.hdf5
Epoch 48/100
94379/94379 [==============================] - 320s 3ms/step - loss: 1.0393 - binary_accuracy: 0.7028 - val_loss: 1.1318 - val_binary_accuracy: 0.6243

Epoch 00048: val_binary_accuracy improved from 0.62225 to 0.62428, saving model to weights.best.hdf5
Epoch 49/100
94379/94379 [==============================] - 324s 3ms/step - loss: 1.0313 - binary_accuracy: 0.7054 - val_loss: 1.1278 - val_binary_accuracy: 0.6231

Epoch 00049: val_binary_accuracy did not improve from 0.62428
Epoch 50/100
94379/94379 [==============================] - 328s 3ms/step - loss: 1.0235 - binary_accuracy: 0.7063 - val_loss: 1.1193 - val_binary_accuracy: 0.6242

Epoch 00050: val_binary_accuracy did not improve from 0.62428
Epoch 51/100
94379/94379 [==============================] - 327s 3ms/step - loss: 1.0150 - binary_accuracy: 0.7108 - val_loss: 1.1198 - val_binary_accuracy: 0.6220

Epoch 00051: val_binary_accuracy did not improve from 0.62428
Epoch 52/100
94379/94379 [==============================] - 331s 4ms/step - loss: 1.0070 - binary_accuracy: 0.7126 - val_loss: 1.1105 - val_binary_accuracy: 0.6262

Epoch 00052: val_binary_accuracy improved from 0.62428 to 0.62623, saving model to weights.best.hdf5
Epoch 53/100
94379/94379 [==============================] - 332s 4ms/step - loss: 0.9983 - binary_accuracy: 0.7151 - val_loss: 1.1018 - val_binary_accuracy: 0.6289

Epoch 00053: val_binary_accuracy improved from 0.62623 to 0.62895, saving model to weights.best.hdf5
Epoch 54/100
94379/94379 [==============================] - 334s 4ms/step - loss: 0.9900 - binary_accuracy: 0.7172 - val_loss: 1.0989 - val_binary_accuracy: 0.6291

Epoch 00054: val_binary_accuracy improved from 0.62895 to 0.62912, saving model to weights.best.hdf5
Epoch 55/100
94379/94379 [==============================] - 336s 4ms/step - loss: 0.9804 - binary_accuracy: 0.7207 - val_loss: 1.0949 - val_binary_accuracy: 0.6309

Epoch 00055: val_binary_accuracy improved from 0.62912 to 0.63085, saving model to weights.best.hdf5
Epoch 56/100
94379/94379 [==============================] - 334s 4ms/step - loss: 0.9716 - binary_accuracy: 0.7243 - val_loss: 1.0878 - val_binary_accuracy: 0.6319

Epoch 00056: val_binary_accuracy improved from 0.63085 to 0.63187, saving model to weights.best.hdf5
Epoch 57/100
94379/94379 [==============================] - 336s 4ms/step - loss: 0.9621 - binary_accuracy: 0.7283 - val_loss: 1.0891 - val_binary_accuracy: 0.6277

Epoch 00057: val_binary_accuracy did not improve from 0.63187
Epoch 58/100
94379/94379 [==============================] - 336s 4ms/step - loss: 0.9521 - binary_accuracy: 0.7349 - val_loss: 1.0800 - val_binary_accuracy: 0.6308

Epoch 00058: val_binary_accuracy did not improve from 0.63187
Epoch 59/100
94379/94379 [==============================] - 338s 4ms/step - loss: 0.9415 - binary_accuracy: 0.7381 - val_loss: 1.0703 - val_binary_accuracy: 0.6357

Epoch 00059: val_binary_accuracy improved from 0.63187 to 0.63573, saving model to weights.best.hdf5
Epoch 60/100
94379/94379 [==============================] - 336s 4ms/step - loss: 0.9304 - binary_accuracy: 0.7440 - val_loss: 1.0667 - val_binary_accuracy: 0.6408

Epoch 00060: val_binary_accuracy improved from 0.63573 to 0.64077, saving model to weights.best.hdf5
Epoch 61/100
94379/94379 [==============================] - 339s 4ms/step - loss: 0.9181 - binary_accuracy: 0.7506 - val_loss: 1.0835 - val_binary_accuracy: 0.6320

Epoch 00061: val_binary_accuracy did not improve from 0.64077
Epoch 62/100
94379/94379 [==============================] - 339s 4ms/step - loss: 0.9059 - binary_accuracy: 0.7563 - val_loss: 1.0739 - val_binary_accuracy: 0.6303

Epoch 00062: val_binary_accuracy did not improve from 0.64077
Epoch 63/100
94379/94379 [==============================] - 338s 4ms/step - loss: 0.8906 - binary_accuracy: 0.7667 - val_loss: 1.0524 - val_binary_accuracy: 0.6462

Epoch 00063: val_binary_accuracy improved from 0.64077 to 0.64624, saving model to weights.best.hdf5
Epoch 64/100
94379/94379 [==============================] - 338s 4ms/step - loss: 0.8770 - binary_accuracy: 0.7729 - val_loss: 1.0558 - val_binary_accuracy: 0.6430

Epoch 00064: val_binary_accuracy did not improve from 0.64624
Epoch 65/100
94379/94379 [==============================] - 343s 4ms/step - loss: 0.8600 - binary_accuracy: 0.7825 - val_loss: 1.0457 - val_binary_accuracy: 0.6558

Epoch 00065: val_binary_accuracy improved from 0.64624 to 0.65582, saving model to weights.best.hdf5
Epoch 66/100
94379/94379 [==============================] - 340s 4ms/step - loss: 0.8406 - binary_accuracy: 0.7952 - val_loss: 1.0700 - val_binary_accuracy: 0.6284

Epoch 00066: val_binary_accuracy did not improve from 0.65582
Epoch 67/100
94379/94379 [==============================] - 342s 4ms/step - loss: 0.8221 - binary_accuracy: 0.8070 - val_loss: 1.0483 - val_binary_accuracy: 0.6437

Epoch 00067: val_binary_accuracy did not improve from 0.65582
Epoch 68/100
94379/94379 [==============================] - 320s 3ms/step - loss: 0.7996 - binary_accuracy: 0.8195 - val_loss: 1.0641 - val_binary_accuracy: 0.6599

Epoch 00068: val_binary_accuracy improved from 0.65582 to 0.65989, saving model to weights.best.hdf5
Epoch 69/100
94379/94379 [==============================] - 320s 3ms/step - loss: 0.7791 - binary_accuracy: 0.8326 - val_loss: 1.0512 - val_binary_accuracy: 0.6665

Epoch 00069: val_binary_accuracy improved from 0.65989 to 0.66645, saving model to weights.best.hdf5
Epoch 70/100
94379/94379 [==============================] - 319s 3ms/step - loss: 0.7532 - binary_accuracy: 0.8477 - val_loss: 1.0178 - val_binary_accuracy: 0.6648

Epoch 00070: val_binary_accuracy did not improve from 0.66645
Epoch 71/100
94379/94379 [==============================] - 320s 3ms/step - loss: 0.7268 - binary_accuracy: 0.8636 - val_loss: 1.0189 - val_binary_accuracy: 0.6779

Epoch 00071: val_binary_accuracy improved from 0.66645 to 0.67790, saving model to weights.best.hdf5
Epoch 72/100
94379/94379 [==============================] - 321s 3ms/step - loss: 0.6992 - binary_accuracy: 0.8794 - val_loss: 1.0802 - val_binary_accuracy: 0.6402

Epoch 00072: val_binary_accuracy did not improve from 0.67790
Epoch 73/100
94379/94379 [==============================] - 321s 3ms/step - loss: 0.6753 - binary_accuracy: 0.8927 - val_loss: 1.0098 - val_binary_accuracy: 0.6824

Epoch 00073: val_binary_accuracy improved from 0.67790 to 0.68239, saving model to weights.best.hdf5
Epoch 74/100
94379/94379 [==============================] - 324s 3ms/step - loss: 0.6503 - binary_accuracy: 0.9051 - val_loss: 1.0339 - val_binary_accuracy: 0.6690

Epoch 00074: val_binary_accuracy did not improve from 0.68239
Epoch 75/100
94379/94379 [==============================] - 321s 3ms/step - loss: 0.6228 - binary_accuracy: 0.9199 - val_loss: 1.0289 - val_binary_accuracy: 0.6723

Epoch 00075: val_binary_accuracy did not improve from 0.68239
Epoch 76/100
94379/94379 [==============================] - 323s 3ms/step - loss: 0.5949 - binary_accuracy: 0.9351 - val_loss: 1.0092 - val_binary_accuracy: 0.7063

Epoch 00076: val_binary_accuracy improved from 0.68239 to 0.70625, saving model to weights.best.hdf5
Epoch 77/100
94379/94379 [==============================] - 323s 3ms/step - loss: 0.5710 - binary_accuracy: 0.9483 - val_loss: 1.0153 - val_binary_accuracy: 0.7079

Epoch 00077: val_binary_accuracy improved from 0.70625 to 0.70786, saving model to weights.best.hdf5
Epoch 78/100
94379/94379 [==============================] - 321s 3ms/step - loss: 0.5467 - binary_accuracy: 0.9594 - val_loss: 1.0363 - val_binary_accuracy: 0.6834

Epoch 00078: val_binary_accuracy did not improve from 0.70786
Epoch 79/100
94379/94379 [==============================] - 330s 3ms/step - loss: 0.5248 - binary_accuracy: 0.9683 - val_loss: 1.0285 - val_binary_accuracy: 0.6868

Epoch 00079: val_binary_accuracy did not improve from 0.70786
Epoch 80/100
94379/94379 [==============================] - 329s 3ms/step - loss: 0.5036 - binary_accuracy: 0.9772 - val_loss: 1.0306 - val_binary_accuracy: 0.7191

Epoch 00080: val_binary_accuracy improved from 0.70786 to 0.71905, saving model to weights.best.hdf5
Epoch 81/100
94379/94379 [==============================] - 333s 4ms/step - loss: 0.4859 - binary_accuracy: 0.9831 - val_loss: 1.0770 - val_binary_accuracy: 0.6708

Epoch 00081: val_binary_accuracy did not improve from 0.71905
Epoch 82/100
94379/94379 [==============================] - 331s 4ms/step - loss: 0.4707 - binary_accuracy: 0.9878 - val_loss: 1.0337 - val_binary_accuracy: 0.7267

Epoch 00082: val_binary_accuracy improved from 0.71905 to 0.72672, saving model to weights.best.hdf5
Epoch 83/100
94379/94379 [==============================] - 333s 4ms/step - loss: 0.4542 - binary_accuracy: 0.9924 - val_loss: 1.1011 - val_binary_accuracy: 0.6701

Epoch 00083: val_binary_accuracy did not improve from 0.72672
Epoch 84/100
94379/94379 [==============================] - 335s 4ms/step - loss: 0.4390 - binary_accuracy: 0.9960 - val_loss: 1.0641 - val_binary_accuracy: 0.6872

Epoch 00084: val_binary_accuracy did not improve from 0.72672
Epoch 85/100
94379/94379 [==============================] - 336s 4ms/step - loss: 0.4290 - binary_accuracy: 0.9971 - val_loss: 1.0483 - val_binary_accuracy: 0.7327

Epoch 00085: val_binary_accuracy improved from 0.72672 to 0.73266, saving model to weights.best.hdf5
Epoch 86/100
94379/94379 [==============================] - 336s 4ms/step - loss: 0.4168 - binary_accuracy: 0.9988 - val_loss: 1.0558 - val_binary_accuracy: 0.7003

Epoch 00086: val_binary_accuracy did not improve from 0.73266
Epoch 87/100
94379/94379 [==============================] - 336s 4ms/step - loss: 0.4077 - binary_accuracy: 0.9993 - val_loss: 1.0508 - val_binary_accuracy: 0.7104

Epoch 00087: val_binary_accuracy did not improve from 0.73266
Epoch 88/100
94379/94379 [==============================] - 336s 4ms/step - loss: 0.3991 - binary_accuracy: 0.9995 - val_loss: 1.1129 - val_binary_accuracy: 0.6754

Epoch 00088: val_binary_accuracy did not improve from 0.73266
Epoch 89/100
94379/94379 [==============================] - 338s 4ms/step - loss: 0.3910 - binary_accuracy: 0.9998 - val_loss: 1.0614 - val_binary_accuracy: 0.7033

Epoch 00089: val_binary_accuracy did not improve from 0.73266
Epoch 90/100
94379/94379 [==============================] - 339s 4ms/step - loss: 0.3836 - binary_accuracy: 0.9999 - val_loss: 1.0774 - val_binary_accuracy: 0.6998

Epoch 00090: val_binary_accuracy did not improve from 0.73266
Epoch 91/100
94379/94379 [==============================] - 346s 4ms/step - loss: 0.3771 - binary_accuracy: 0.9999 - val_loss: 1.0988 - val_binary_accuracy: 0.6884

Epoch 00091: val_binary_accuracy did not improve from 0.73266
Epoch 92/100
94379/94379 [==============================] - 340s 4ms/step - loss: 0.3712 - binary_accuracy: 1.0000 - val_loss: 1.0796 - val_binary_accuracy: 0.6984

Epoch 00092: val_binary_accuracy did not improve from 0.73266
Epoch 93/100
94379/94379 [==============================] - 326s 3ms/step - loss: 0.3653 - binary_accuracy: 1.0000 - val_loss: 1.0723 - val_binary_accuracy: 0.7073

Epoch 00093: val_binary_accuracy did not improve from 0.73266
Epoch 94/100
94379/94379 [==============================] - 339s 4ms/step - loss: 0.3601 - binary_accuracy: 1.0000 - val_loss: 1.0879 - val_binary_accuracy: 0.6961

Epoch 00094: val_binary_accuracy did not improve from 0.73266
Epoch 95/100
94379/94379 [==============================] - 352s 4ms/step - loss: 0.3555 - binary_accuracy: 1.0000 - val_loss: 1.0944 - val_binary_accuracy: 0.6950

Epoch 00095: val_binary_accuracy did not improve from 0.73266
Epoch 96/100
94379/94379 [==============================] - 345s 4ms/step - loss: 0.3506 - binary_accuracy: 1.0000 - val_loss: 1.1128 - val_binary_accuracy: 0.6886

Epoch 00096: val_binary_accuracy did not improve from 0.73266
Epoch 97/100
94379/94379 [==============================] - 357s 4ms/step - loss: 0.3464 - binary_accuracy: 1.0000 - val_loss: 1.0950 - val_binary_accuracy: 0.6954

Epoch 00097: val_binary_accuracy did not improve from 0.73266
Epoch 98/100
94379/94379 [==============================] - 363s 4ms/step - loss: 0.3422 - binary_accuracy: 1.0000 - val_loss: 1.0871 - val_binary_accuracy: 0.7008

Epoch 00098: val_binary_accuracy did not improve from 0.73266
Epoch 99/100
94379/94379 [==============================] - 344s 4ms/step - loss: 0.3380 - binary_accuracy: 1.0000 - val_loss: 1.0816 - val_binary_accuracy: 0.7071

Epoch 00099: val_binary_accuracy did not improve from 0.73266
Epoch 100/100
94379/94379 [==============================] - 343s 4ms/step - loss: 0.3340 - binary_accuracy: 1.0000 - val_loss: 1.0939 - val_binary_accuracy: 0.6991

Epoch 00100: val_binary_accuracy did not improve from 0.73266
In [16]:
import matplotlib.pyplot as plt

plt.figure(figsize=(20,15))
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('Model Loss', fontsize = 20)
plt.ylabel('Loss', fontsize = 20)
plt.xlabel('Epoch', fontsize = 20)
plt.legend(['Train', 'Validation'], fontsize = 20)
plt.show()

plt.figure(figsize=(20,15))
plt.plot(history.history['binary_accuracy'])
plt.plot(history.history['val_binary_accuracy'])
plt.title('Model Accuracy', fontsize = 20)
plt.ylabel('Accuracy', fontsize = 20)
plt.xlabel('Epoch', fontsize = 20)
plt.legend(['Train', 'Validation'], fontsize = 20)
plt.show()

We can see that the model starts overfitting almost immediately, and it is hard to regularize: adding heavier Dropout prevents the model from learning at all. It is hard to say why we observe this behavior. The training set contains almost 100 000 examples, which seems like a lot, but it is still not enough given how many parameters the CNN uses. What we definitely observe is that increasing the length of the cut segments yields higher and higher classification accuracy. Here I am limited by the memory of my laptop and cannot increase the segment length further; in addition, longer segments would enlarge the parameter space even more, i.e. lead to even heavier overfitting. Perhaps more examples would help: for instance, chopping each Neanderthal introgression segment into multiple short ~100 bp segments would increase the training set without increasing the parameter space.
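The augmentation idea mentioned above, chopping each long segment into multiple short windows so that each window inherits the parent segment's label, can be sketched as follows (the window size and step are illustrative choices, not values used elsewhere in this notebook):

```python
def chop_into_windows(seq, window=100, step=100):
    """Split a DNA segment into fixed-length windows.

    Each window inherits the label of the parent segment, so one long
    introgressed segment yields many short training examples.
    """
    return [seq[i:i + window]
            for i in range(0, len(seq) - window + 1, step)]

# A 1000 bp segment yields 10 non-overlapping 100 bp training examples
segment = 'ACGT' * 250
windows = chop_into_windows(segment)
print(len(windows), len(windows[0]))  # → 10 100
```

Using a step smaller than the window would produce overlapping windows and inflate the training set further, at the cost of correlated examples.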

In [17]:
from sklearn.metrics import confusion_matrix
import itertools

plt.figure(figsize=(15,10))

predicted_labels = model.predict(np.stack(test_features))
cm = confusion_matrix(np.argmax(test_labels, axis=1), 
                      np.argmax(predicted_labels, axis=1))
print('Confusion matrix:\n',cm)

cm = cm.astype('float') / cm.sum(axis = 1)[:, np.newaxis]

plt.imshow(cm, cmap = plt.cm.Blues)
plt.title('Normalized confusion matrix', fontsize = 20)
plt.colorbar()
plt.xlabel('Predicted label', fontsize = 20)
plt.ylabel('True label', fontsize = 20)
for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):
    plt.text(j, i, format(cm[i, j], '.2f'),
             horizontalalignment = 'center', verticalalignment = 'center', fontsize = 20,
             color='white' if cm[i, j] > 0.5 else 'black')
plt.show()
Confusion matrix:
 [[ 9864  5049]
 [ 3554 11027]]
In [20]:
scores = model.evaluate(test_features, test_labels, verbose=0)
print("Accuracy: %.0f%%" % (scores[1]*100))
Accuracy: 71%

We reach quite good accuracy of 71% on the test data set. This is not fantastic, but it could perhaps be improved by a more careful selection of the regions of Neanderthal introgression, e.g. by applying RepeatMasker or a similar tool. Again, similar to the gene vs. intergenic region classification, for each sequence we can now construct a so-called Saliency Map, i.e. the importance of each nucleotide for the sequence classification. In this way we can see which patterns the CNN learnt to treat as most important for distinguishing Neanderthal introgressed from depleted sequences.

In [21]:
import keras.backend as K

def compute_salient_bases(model, x):
    # Gradient of the class-1 output with respect to the input one-hot sequence
    input_tensors = [model.input]
    gradients = model.optimizer.get_gradients(model.output[0][1], model.input)
    compute_gradients = K.function(inputs = input_tensors, outputs = gradients)

    # Add a batch dimension, evaluate the gradients and form gradient * input,
    # keeping only the positive contribution of each base
    x_value = np.expand_dims(x, axis=0)
    gradients = compute_gradients([x_value])[0][0]
    sal = np.clip(np.sum(np.multiply(gradients, x), axis=1), a_min=0, a_max=None)
    return sal
In [22]:
sequence_index = 5
K.set_learning_phase(1) #set learning phase
sal = compute_salient_bases(model, input_features[sequence_index])

plt.figure(figsize=[16,5])
zoom = len(sal)
barlist = plt.bar(np.arange(len(sal[0:zoom])), sal[0:zoom])
plt.xlabel('Bases')
plt.ylabel('Magnitude of saliency values')
plt.xticks(np.arange(len(sal[0:zoom])), list(sequences[sequence_index][0:zoom]), size = 6);
plt.title('Saliency map for bases in one of the sequences');
plt.show()
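For intuition, the gradient × input rule inside compute_salient_bases can be reproduced with a toy linear scorer whose gradient is known analytically; the weight matrix below is made up purely for illustration:

```python
import numpy as np

# Toy one-hot encoded sequence of length 4 over the alphabet A, C, G, T
x = np.array([[1, 0, 0, 0],   # A
              [0, 1, 0, 0],   # C
              [0, 0, 1, 0],   # G
              [0, 0, 0, 1]])  # T

# Made-up weights of a linear scorer f(x) = sum(w * x);
# the gradient of f with respect to x is simply w
w = np.array([[ 0.9, -0.2,  0.1,  0.0],
              [-0.5,  0.4,  0.0,  0.1],
              [ 0.2,  0.0, -0.7,  0.3],
              [ 0.0,  0.1,  0.2,  0.6]])

# Same formula as in compute_salient_bases: per-base saliency is the
# positive part of sum(gradient * input) over the 4 nucleotide channels
sal = np.clip(np.sum(w * x, axis=1), a_min=0, a_max=None)
print(sal)  # → [0.9 0.4 0.  0.6]
```

The third base gets zero saliency because its gradient × input contribution is negative and is clipped away, exactly as in the CNN version above.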

2D CNN for Neanderthal Introgressed vs. Depleted Sequence Classification

Here we will try to convert our sequences into 2D images and run a 2D Convolutional Neural Network (CNN) as if we were solving a pattern recognition task for image classification. The catch is that the final numpy array will be 4-dimensional: the first dimension is the number of sequences / statistical observations / training examples, the second and third dimensions are the 2D image size in pixels (one pixel corresponds to one nucleotide), and the fourth dimension encodes which of the 4 nucleotides (A, C, T or G) is observed; it can be seen as the analogue of the 4 channels in CMYK color coding. Because of the 4D structure of the input data set, I need more memory than I have on my laptop. Currently I can input 1000 bp stretches of DNA (with the subsequent one-hot encoding) when implementing the 1D CNN, or alternatively 32 bp stretches of DNA when working with the 2D CNN, since $32 \times 32 = 1024 \approx 1000$.
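The memory argument can be made concrete with a quick back-of-the-envelope calculation; the 100 000 examples and float64 storage below are illustrative assumptions, not exact values from this notebook:

```python
import numpy as np

n_examples = 100_000                               # rough training set size
bytes_per_value = np.dtype(np.float64).itemsize    # 8 bytes per value

# 1D CNN input: (examples, 1000 bp, 4 nucleotide channels)
gb_1d = n_examples * 1000 * 4 * bytes_per_value / 1e9

# 2D CNN input: (examples, 32, 32 pixels, 4 channels); 32 * 32 = 1024
gb_2d = n_examples * 32 * 32 * 4 * bytes_per_value / 1e9

print(f'1D input: {gb_1d:.1f} GB, 2D input: {gb_2d:.1f} GB')
```

Both layouts hold roughly the same number of values, so the 2D representation does not save memory by itself; it only changes which spatial patterns the convolutions can pick up.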

In [ ]:
import os
from Bio import SeqIO

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

intr_file = 'hg19_intr_clean.fa'
depl_file = 'hg19_depl_clean.fa'

a = 0
intr_seqs = []
depl_seqs = []
for intr, depl in zip(SeqIO.parse(intr_file, 'fasta'), SeqIO.parse(depl_file, 'fasta')):
    cut = 32
    if len(str(intr.seq)) < cut or len(str(depl.seq)) < cut:
        continue
    s_intr = str(intr.seq)[0:cut]
    s_depl = str(depl.seq)[0:cut]
    if s_intr.count('A')>0 and s_intr.count('C')>0 and s_intr.count('G')>0 and s_intr.count('T')>0 and \
    s_depl.count('A')>0 and s_depl.count('C')>0 and s_depl.count('G')>0 and s_depl.count('T')>0:
        intr_seqs.append(s_intr)
        depl_seqs.append(s_depl)
    a = a + 1
    if a%10000 == 0:
        print('Finished ' + str(a) + ' entries')
In [ ]:
sequences = intr_seqs + depl_seqs
len(sequences)
In [ ]:
import numpy as np
labels = list(np.ones(len(intr_seqs))) + list(np.zeros(len(depl_seqs)))
len(labels)
In [ ]:
from sklearn.preprocessing import LabelEncoder, OneHotEncoder

import warnings
warnings.filterwarnings('ignore')

integer_encoder = LabelEncoder()  
one_hot_encoder = OneHotEncoder()   
input_features = []

for sequence in sequences:
  integer_encoded = integer_encoder.fit_transform(list(sequence))
  integer_encoded = np.array(integer_encoded).reshape(-1, 1)
  one_hot_encoded = one_hot_encoder.fit_transform(integer_encoded)
  input_features.append(one_hot_encoded.toarray())

np.set_printoptions(threshold = 40)
input_features = np.stack(input_features)
print("Example sequence\n-----------------------")
print('DNA Sequence #1:\n',sequences[0][:10],'...',sequences[0][-10:])
print('One hot encoding of Sequence #1:\n',input_features[0].T)
In [ ]:
one_hot_encoder = OneHotEncoder()
labels = np.array(labels).reshape(-1, 1)
input_labels = one_hot_encoder.fit_transform(labels).toarray()

print('Labels:\n',labels.T)
print('One-hot encoded labels:\n',input_labels.T)
In [ ]:
input_features.shape
In [ ]:
B = np.array(np.repeat(input_features[:, :, np.newaxis, :], repeats = input_features.shape[1], axis = 2))
B.shape
In [ ]:
from sklearn.model_selection import train_test_split

train_features, test_features, train_labels, test_labels = train_test_split(
    B, input_labels, test_size = 0.2, random_state = 42)
In [ ]:
from keras.models import Sequential
from keras.regularizers import l2, l1
from keras.callbacks import ModelCheckpoint
from keras.optimizers import SGD, Adam, Adadelta, RMSprop
from keras.layers import Conv1D, Conv2D, Dense, MaxPooling1D, MaxPooling2D, Flatten, Dropout, Activation

import warnings
warnings.filterwarnings('ignore')

model = Sequential()
model.add(Conv2D(filters = 32, kernel_size = (5, 5), padding = 'same', 
                 input_shape = (train_features.shape[1], train_features.shape[2], train_features.shape[3])))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size = (2, 2)))
#model.add(Dropout(0.1))

model.add(Flatten())
model.add(Dense(10)) # kernel_regularizer = l1(0.00001)
model.add(Activation("sigmoid"))
#model.add(Dropout(0.1))
model.add(Dense(2, activation = 'softmax'))

epochs = 100
lrate = 0.001
decay = lrate / epochs
sgd = SGD(lr = lrate, momentum = 0.9, decay = decay, nesterov = False)
#sgd = SGD(lr = lrate, momentum = 0.9, nesterov = False)
model.compile(loss = 'binary_crossentropy', optimizer = sgd, metrics = ['binary_accuracy'])
#model.compile(loss='binary_crossentropy', optimizer=Adam(lr = lrate), metrics=['binary_accuracy'])
checkpoint = ModelCheckpoint("weights.best.hdf5", monitor = 'val_binary_accuracy', verbose = 1, 
                             save_best_only = True, mode = 'max')
model.summary()
In [ ]:
import warnings
warnings.filterwarnings('ignore')

history = model.fit(train_features, train_labels, 
                    epochs = epochs, verbose = 1, validation_split = 0.2, batch_size = 32, shuffle = True, 
                    callbacks = [checkpoint])

Intersect Neanderthal Segments with Genes

Here we are going to ask a simple question: do human genes from the hg19 reference genome predominantly overlap with Neanderthal introgressed or depleted regions? To answer this question we will run a heavy randomization test. First, we use bedtools to count the intersects between the Neanderthal introgressed regions and the genes. Second, we repeatedly draw random Neanderthal depleted regions of the same lengths as the introgressed regions and count their intersects with the genes for each draw. This gives us a distribution of intersect counts between depleted regions and genes, against which we can judge whether the genes predominantly overlap with the introgressed or the depleted regions. Let us start with reading the coordinates of the introgressed regions:
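What `bedtools intersect ... | wc -l` reports can be sketched in pure Python as a scan over two BED-like interval lists; the intervals below are made-up toy coordinates on a single chromosome, not real hg19 regions:

```python
def count_intersects(a, b):
    """Count pairs of overlapping intervals between two BED-like lists.

    Intervals are (start, end) half-open, as in BED files. A simple
    nested scan is enough for illustration; bedtools uses a much faster
    sorted-sweep algorithm for genome-scale files.
    """
    count = 0
    for a_start, a_end in a:
        for b_start, b_end in b:
            # Half-open intervals overlap iff each starts before the other ends
            if a_start < b_end and b_start < a_end:
                count += 1
    return count

# Toy example: two introgressed regions vs three gene regions
regions = [(100, 200), (500, 800)]
genes = [(150, 180), (190, 250), (900, 950)]
print(count_intersects(regions, genes))  # → 2
```

The first region overlaps two genes and the second overlaps none, giving two intersects in total, which mirrors how the bedtools counts below are interpreted.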

In [1]:
import os
import pandas as pd
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')
intr_coords = pd.read_csv('Akey_intr_coords.bed', header = None, sep = "\t")
intr_coords.head()
Out[1]:
0 1 2
0 chr1 2903159 2915884
1 chr1 2932446 2972497
2 chr1 2960608 2996556
3 chr1 2960608 2999518
4 chr1 2960608 3001253

Now let us check the number of intersects between the introgressed regions and the genes from hg19 human reference genome:

In [2]:
%%bash
Path=/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes
bedtools intersect -a Akey_intr_coords.bed -b $Path/gene_coords.txt | wc -l
140821

And for comparison, let us check the number of intersects between the depleted regions and the genes from the hg19 human reference genome:

In [3]:
%%bash
Path=/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes
bedtools intersect -a Akey_depl_coords.bed -b $Path/gene_coords.txt | wc -l
150438

We can see that genes intersect with the depleted regions more often than with the introgressed ones. However, what if we just happened by chance to select "good" depleted regions that overlap more often with the genes? To make a robust statement about the overlap of introgressed vs. depleted regions with genes, we need to randomly draw the depleted regions at least a few more times. To do that we again need to know the lengths of the chromosomes from hg19.

In [4]:
chr_sizes = pd.read_csv("hg19.fa.gz.fai", header = None, sep = "\t")
chr_sizes = chr_sizes.drop([2, 3, 4], axis = 1)
chr_sizes.head()
Out[4]:
0 1
0 chr1 249250621
1 chr2 243199373
2 chr3 198022430
3 chr4 191154276
4 chr5 180915260

Now we repeat drawing depleted regions a few times. This procedure takes quite a lot of time due to my non-optimal code implementation, so I will repeat it only ~10-20 times, which should be enough to demonstrate that the number of intersects between the depleted regions and the genes is always larger than between the introgressed regions and the genes.

In [5]:
import os
import subprocess
import numpy as np
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

perm_n = []
for k in range(5):
    chr_list = []
    start_list = []
    end_list = []
    intr_lengths = list(intr_coords.iloc[:, 2] - intr_coords.iloc[:, 1])
    a = 0
    for i in range(intr_coords.shape[0]):
        chr_df = intr_coords[intr_coords[0].isin([intr_coords.iloc[i,0]])]
        overlap = True
        while overlap == True:
            reg_start = np.random.randint(1, int(chr_sizes[chr_sizes[0]==intr_coords.iloc[i,0]].iloc[:,1]))
            reg_end = reg_start + intr_lengths[i]
            for j in range(chr_df.shape[0]):
                b1 = chr_df.iloc[j,1]
                b2 = chr_df.iloc[j,2]
                if (reg_start > b1 and reg_start < b2) or (reg_end > b1 and reg_end < b2) or \
                (b1 > reg_start and b1 < reg_end) or (b2 > reg_start and b2 < reg_end):
                    overlap = True
                    break
                else:
                    overlap = False
        chr_list.append(intr_coords.iloc[i,0])
        start_list.append(reg_start)
        end_list.append(reg_end)
        a = a + 1
        if a%20000 == 0:
            print('Finished ' + str(a) + ' Neanderthal haplotypes')
    depl_coords = pd.DataFrame({'0': chr_list, '1': start_list, '2': end_list})
    depl_coords.to_csv("temp.txt", index = False, header = False, sep = "\t")

    genes_path = '/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes/'
    with open('n.txt', 'w') as fp:
        subprocess.run(['bedtools', 'intersect', '-a', 'temp.txt', '-b', 
                        genes_path + 'gene_coords.txt'], stdout = fp)
    akey_n = pd.read_csv('n.txt', header = None, sep = "\t")
    print(k, akey_n.shape[0])
    print('**********************************************************')
    perm_n.append(akey_n.shape[0])
Finished 20000 Neanderthal haplotypes
Finished 40000 Neanderthal haplotypes
Finished 60000 Neanderthal haplotypes
Finished 80000 Neanderthal haplotypes
0 150060
**********************************************************
Finished 20000 Neanderthal haplotypes
Finished 40000 Neanderthal haplotypes
Finished 60000 Neanderthal haplotypes
Finished 80000 Neanderthal haplotypes
1 151156
**********************************************************
Finished 20000 Neanderthal haplotypes
Finished 40000 Neanderthal haplotypes
Finished 60000 Neanderthal haplotypes
Finished 80000 Neanderthal haplotypes
2 149792
**********************************************************
Finished 20000 Neanderthal haplotypes
Finished 40000 Neanderthal haplotypes
Finished 60000 Neanderthal haplotypes
Finished 80000 Neanderthal haplotypes
3 150381
**********************************************************
Finished 20000 Neanderthal haplotypes
Finished 40000 Neanderthal haplotypes
Finished 60000 Neanderthal haplotypes
Finished 80000 Neanderthal haplotypes
4 150075
**********************************************************
In [9]:
perm_n = [150438, 150340, 149772, 149798, 149664, 150748, 151154, 
          150818, 150498, 151426, 151244, 151267, 150060, 151156, 149792, 150381, 150075]
perm_n
Out[9]:
[150438,
 150340,
 149772,
 149798,
 149664,
 150748,
 151154,
 150818,
 150498,
 151426,
 151244,
 151267,
 150060,
 151156,
 149792,
 150381,
 150075]
In [12]:
import seaborn as sns
import matplotlib.pyplot as plt
plt.figure(figsize=(20,12))
plt.axvline(x = 140821, linewidth = 4, color = 'r')
sns.distplot(perm_n)
plt.title("Distribution of Gene-Depletion Intersects: Vernot and Akey, Science 2016", fontsize = 30)
plt.xlabel("Number of Intersects Between Gene and Neanderthal Depleted Regions", fontsize = 30)
plt.ylabel("Frequency", fontsize = 30)
plt.show()

Here it is remarkable that none of the 17 draws of depleted regions gives a number of intersects with genes below the number of intersects between the introgressed regions and the genes. If we were to calculate a p-value of the enrichment, it would be essentially zero, but a more conservative way to express it would be p-value < 1 / 17 ≈ 0.06. This strongly suggests that genes predominantly overlap with the regions of depleted Neanderthal ancestry.
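This permutation logic can be written down explicitly (a minimal sketch using the perm_n counts listed above; adding 1 to both numerator and denominator is one common convention for empirical p-values that avoids reporting an exact zero):

```python
# Empirical p-value for the enrichment: count how many random draws of
# depleted regions give fewer gene intersects than the observed number
# of intersects for the introgressed regions (140821).
perm_n = [150438, 150340, 149772, 149798, 149664, 150748, 151154,
          150818, 150498, 151426, 151244, 151267, 150060, 151156,
          149792, 150381, 150075]
observed_intr = 140821  # intersects between introgressed regions and genes

# Number of permutations at or below the observed value
n_extreme = sum(1 for n in perm_n if n <= observed_intr)

# Add-one correction: an empirical p-value is never exactly zero
p_value = (n_extreme + 1) / (len(perm_n) + 1)
print(n_extreme, round(p_value, 4))
```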

Ranking Genes by Their GC- and AT-Content

Now let us perform a simple exercise which will be very useful for us later. Here we are going to calculate the GC-content of each gene from its sequence. We have a fasta-file with the sequence of each gene extracted from the hg19 build of the human reference genome. It will be relatively easy to go through those sequences one-by-one and count the percentage of C and G nucleotides. The goal of this exercise is to figure out which genes in the human genome are the most GC-rich and the most AT-rich; again, we will need this later.

In [1]:
import os
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes/')

ids = []
seqs = []
GC_content = []
with open('hg19_gene_regions.fa','r') as fin:
    for line in fin:
        line = line.rstrip()
        if line[0] == '>' and len(seqs) == 0:
            ids.append(line)
        elif line[0] == '>' and len(seqs) > 0:
            my_seq = ''.join(seqs).upper()
            GC = round((my_seq.count('C') + my_seq.count('G')) / (len(my_seq) - my_seq.count('N')), 4) * 100
            GC_content.append(GC)
            seqs = []
            ids.append(line)
        else:
            seqs.append(line)
my_seq = ''.join(seqs).upper()
GC = round((my_seq.count('C') + my_seq.count('G')) / (len(my_seq) - my_seq.count('N')), 4) * 100
GC_content.append(GC)

GC_df = pd.DataFrame({'Coord': ids, 'GCcontent': GC_content})
GC_df = GC_df.sort_values(by = ['GCcontent'], ascending = False)
print('Most GC-rich genes:')
print(GC_df.head())
print('\n')
print('Most AT-rich genes:')
print(GC_df.tail())
GC_df.to_csv('genes_GC_content.txt', index = False, header = True, sep = '\t')
Most GC-rich genes:
                                Coord  GCcontent
32749  >chr6_cox_hap2:4729257-4729276      100.0
33853  >chr6_qbl_hap6:4517689-4517708      100.0
30984         >chr6:33285442-33285461      100.0
33032  >chr6_dbb_hap3:4566766-4566785      100.0
33562  >chr6_mcf_hap5:4759185-4759204      100.0


Most AT-rich genes:
                            Coord  GCcontent
10653    >chr14:37421514-37421596      24.10
31972   >chr6:140526389-140526463      21.33
9227   >chr12:116586365-116586459      21.05
10579    >chr14:28102411-28102484      20.27
28113   >chr4:126428414-126428462      16.33
In [4]:
%%bash
cd /home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes/
samtools faidx hg19.fa.gz chr4:126428414-126428462 | sed '1d' | tr -d '\n'
CTGTAATATAAATTTAATTTATTCTCTATCATTAAAAAATGTATTACAG
In [5]:
my_str = 'CTGTAATATAAATTTAATTTATTCTCTATCATTAAAAAATGTATTACAG'
(my_str.count('C') + my_str.count('G')) / len(my_str)
Out[5]:
0.16326530612244897

Ok, it would be nice to see the gene symbols. It is strange to see genes with 100% GC-content; those must be some non-coding RNAs.

In [6]:
import pandas as pd
Path = '/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes/'
gene_coords = pd.read_csv(Path + 'gene_coords.txt', header=None, sep="\t")
gene_coords['Coord'] = '>' + gene_coords[0].astype(str) + ':' \
+ gene_coords[1].astype(str) + '-' + gene_coords[2].astype(str)
gene_coords.head()
Out[6]:
0 1 2 3 Coord
0 chr1 11874 14409 DDX11L1 >chr1:11874-14409
1 chr1 14362 29370 WASH7P >chr1:14362-29370
2 chr1 17369 17436 MIR6859-1 >chr1:17369-17436
3 chr1 17369 17436 MIR6859-2 >chr1:17369-17436
4 chr1 17369 17436 MIR6859-3 >chr1:17369-17436
In [16]:
GC_df_gene = pd.merge(GC_df, gene_coords, on = 'Coord')
GC_df_gene = GC_df_gene[[3, 0, 1, 2, 'GCcontent']]
GC_df_gene.columns = ['Gene', 'Chr', 'Start', 'End', 'GCcontent']
GC_df_gene.head()
Out[16]:
Gene Chr Start End GCcontent
0 MIR1234 chr6_cox_hap2 4729257 4729276 100.0
1 MIR1234 chr6_qbl_hap6 4517689 4517708 100.0
2 MIR1234 chr6 33285442 33285461 100.0
3 MIR1234 chr6_dbb_hap3 4566766 4566785 100.0
4 MIR1234 chr6_mcf_hap5 4759185 4759204 100.0
In [17]:
GC_df_gene.tail()
Out[17]:
Gene Chr Start End GCcontent
44993 MIR4503 chr14 37421514 37421596 24.10
44994 MIR3668 chr6 140526389 140526463 21.33
44995 MIR620 chr12 116586365 116586459 21.05
44996 MIR3171 chr14 28102411 28102484 20.27
44997 MIR2054 chr4 126428414 126428462 16.33

Yes, indeed, the micro-RNAs (the MIR genes) seem to be extreme in their GC-content. Let us filter them out and display the most GC-rich and most AT-rich genes again, this time without the MIR, LINC or SNOR genes, which are small non-coding RNAs.

In [22]:
GC_df_gene_noMIR = GC_df_gene[~GC_df_gene['Gene'].str.contains('MIR')]
GC_df_gene_noMIR = GC_df_gene_noMIR[~GC_df_gene_noMIR['Gene'].str.contains('LINC')]
GC_df_gene_noMIR = GC_df_gene_noMIR[~GC_df_gene_noMIR['Gene'].str.contains('SNOR')]
GC_df_gene_noMIR.head(10)
Out[22]:
Gene Chr Start End GCcontent
81 BHLHA9 chr17 1173858 1174565 78.67
93 UTF1 chr10 135043778 135045062 76.96
105 RNF225 chr19 58907457 58908446 75.96
108 NPB chr17 79859985 79860781 75.66
126 LKAAEAR1 chr20 62714733 62715712 75.20
127 LRRC26 chr9 140063212 140064491 75.16
132 CTXN1 chr19 7989381 7991051 74.93
140 ZNRF2P2 chr7 29724388 29725437 74.38
141 HES4 chr1 934344 935552 74.36
142 HES4 chr1 934342 935552 74.32
In [36]:
GC_df_gene_noMIR.to_csv('genes_GC_content.txt', index = False, header = True, sep = '\t')
GC_df_gene_noMIR.tail(10)
Out[36]:
Gene Chr Start End GCcontent
44833 SI chr3 164696686 164796283 31.26
44836 LOC105374704 chr5 29880667 29882210 31.22
44910 SPATA17-AS1 chr1 217954540 217958462 30.82
44912 GPR22 chr7 107110502 107116125 30.58
44916 CYLC2 chr9 105757593 105780770 30.37
44925 HIF1A-AS2 chr14 62213757 62215807 29.99
44937 ANGPTL3 chr1 63063158 63071976 29.49
44939 NTS chr12 86268073 86276770 29.39
44975 LOC102467226 chr5 120658245 120661532 27.43
44981 DEFB114 chr6 49928005 49931818 26.85

It is very interesting that there are human genes (that are not small non-coding RNAs) with GC-content as high as ~80%, and there are genes with as little as ~30% GC-content. Let us display the distribution of GC-content across the genes.

In [37]:
import seaborn as sns
import matplotlib.pyplot as plt
plt.figure(figsize=(20,15))
sns.distplot(GC_df_gene_noMIR['GCcontent'])
plt.title("Distribution of GC-content Across Genes from HG19 Human Reference Genome", fontsize = 20)
plt.xlabel("GC-content of Genes", fontsize = 20)
plt.ylabel("Frequency", fontsize = 20)
plt.show()

We can see that the mode is indeed at around 41%, as expected; however, the distribution does not look very symmetric and seems to contain many more genes with GC-content above 41% than below it. This probably reflects the known observation that genes are in general GC-rich regions. If we calculate the mean GC-content of the gene regions, it comes out at around 47%, which is much higher than the 41% across the whole human reference genome.

In [38]:
np.mean(GC_df_gene_noMIR['GCcontent'])
Out[38]:
47.254267483922824

Another very interesting observation is the ANGPTL3 gene, which is very AT-rich with a GC-content of only ~30%. This gene has a very strong link to HDL cholesterol levels in human blood, which is one of the determinant factors in the genetics of Type 2 Diabetes (T2D) Mellitus. Indeed, if we display a piece of the ANGPTL3 gene, we observe that it is an extremely GC-poor gene.

In [34]:
%%bash
cd /home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes/
samtools faidx hg19.fa.gz chr1:63063158-63071976 | sed '1d' | tr -d '\n' | cut -c1-500
ATATATAGAGTTAAGAAGTCTAGGTCTGCTTCCAGAAGAAAACAGTTCCACGTTGCTTGAAATTGAAAATCAAGATAAAAATGTTCACAATTAAGCTCCTTCTTTTTATTGTTCCTCTAGTTATTTCCTCCAGAATTGATCAAGACAATTCATCATTTGATTCTCTATCTCCAGAGCCAAAATCAAGATTTGCTATGTTAGACGATGTAAAAATTTTAGCCAATGGCCTCCTTCAGTTGGGACATGGTCTTAAAGACTTTGTCCATAAGACGAAGGGCCAAATTAATGACATATTTCAAAAACTCAACATATTTGATCAGTCTTTTTATGATCTATCGCTGCAAACCAGTGAAATCAAAGAAGAAGAAAAGGAACTGAGAAGAACTACATATAAACTACAAGTCAAAAATGAAGAGGTAAAGAATATGTCACTTGAACTCAACTCAAAACTTGAAAGCCTCCTAGAAGAAAAAATTCTACTTCAACAAAAAGTGAAATAT

Below we will show that AT-rich regions predominantly came to the human genome from Neanderthals. Particularly, the ANGPTL3 gene will be predicted to be a Neanderthal gene that was introgressed into the modern human genome ~2000 generations ago when humans migrated out of Africa and bred with Neanderthals in Europe and Asia.

Bag of Words for Neanderthal Introgressed vs. Depleted Sequence Classification

Here we are going to use elements of Natural Language Processing (NLP) and K-mer analysis in order to find DNA motifs that differ between the Neanderthal introgressed and depleted regions of the human genome. A DNA sequence is essentially a text, so we can apply the whole power of the NLP apparatus to DNA analysis. However, where are the sentences and words in one large DNA sequence? It turns out that a sequence can be split into K-mers, which represent words / tokens, and those K-mers can be concatenated in a space-delimited manner so that we get a sentence at the end. Now we are done, we can use NLP!
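As a tiny illustration of this tokenization (a minimal sketch; get_kmers here mirrors the getKmers function used later in this notebook):

```python
def get_kmers(sequence, size):
    """Split a sequence into overlapping K-mers with a sliding window of 1."""
    return [sequence[i:i + size].upper() for i in range(len(sequence) - size + 1)]

# A DNA "sentence": space-delimited K-mer "words"
sentence = ' '.join(get_kmers('ATGCATGC', 5))
print(sentence)  # ATGCA TGCAT GCATG CATGC
```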

In [1]:
from IPython.display import Image
Path = '/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/'
Image(Path + 'Kmers.png', width=2000)
Out[1]:

Here we will start with the simplest Bag of Words model, which simply counts the frequencies of words / K-mers in two different classes of texts / sequences. In our case we are asking: are some K-mers more frequent in the Neanderthal introgressed regions than in the regions of depleted Neanderthal ancestry? To answer this question we are going to read the introgressed and depleted sequences and split them into K-mers:

In [2]:
import os
from Bio import SeqIO
from Bio.Seq import Seq

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

intr_file = 'hg19_intr_clean.fa'
depl_file = 'hg19_depl_clean.fa'

e = 0
intr_seqs = []
depl_seqs = []
for intr, depl in zip(SeqIO.parse(intr_file, 'fasta'), SeqIO.parse(depl_file, 'fasta')):
    
    cutoff = 500
    my_intr_seq = str(intr.seq)[0:cutoff]
    my_depl_seq = str(depl.seq)[0:cutoff]
    
    intr_seqs.append(my_intr_seq)
    #my_intr_reverse_compliment_seq = str(Seq(my_intr_seq).reverse_complement())
    #intr_seqs.append(my_intr_reverse_compliment_seq)
    
    depl_seqs.append(my_depl_seq)
    #my_depl_reverse_compliment_seq = str(Seq(my_depl_seq).reverse_complement())
    #depl_seqs.append(my_depl_reverse_compliment_seq)
    
    e = e + 1
    if e%20000 == 0:
        print('Finished ' + str(e) + ' entries')
Finished 20000 entries
Finished 40000 entries
Finished 60000 entries

Now, after we have read the introgressed and depleted sequences into memory, we need to define a function that splits each given sequence into K-mers, and use it to preprocess the sequences and convert them into sentences. Suppose the sequences in the lists above are different texts. It is natural to consider K-mers as the words of those texts. Different sequences can share the same K-mers, indicating that the same words can be used in different texts. However, there are words / K-mers that are specific for certain texts / sequences, or their counts can say something about the topic of the text / the biology of the sequence. Here we are going to split each sequence into K-mers and construct sentences, which are lists of words / K-mers.

In [3]:
def getKmers(sequence, size):
    return [sequence[x:x+size].upper() for x in range(len(sequence) - size + 1)]
In [4]:
kmer = 5

print('Building Neanderthal introgressed sequences')
intr_sentences = []
for i in range(len(intr_seqs)):
    intr_sentences.append(getKmers(intr_seqs[i], kmer))

print('Building Neanderthal depleted sequences')
depl_sentences = []
for i in range(len(depl_seqs)):
    depl_sentences.append(getKmers(depl_seqs[i], kmer))
Building Neanderthal introgressed sequences
Building Neanderthal depleted sequences

The words / K-mers provide a vocabulary whose size we will determine later. We can also use the Counter class for efficient counting of the words, as well as for displaying and making barplots of the most common words in the Neanderthal introgressed and depleted regions.

In [6]:
from collections import Counter
import matplotlib.pyplot as plt
fig = plt.figure(figsize=(20,18))
fig.subplots_adjust(hspace = 0.4, wspace = 0.4)

plt.subplot(2, 1, 1)
D = dict(Counter([item for sublist in intr_sentences for item in sublist]).most_common(20))
plt.bar(range(len(D)), list(D.values()), align='center')
plt.title('Most Common K-mers for Neanderthal Introgressed Regions', fontsize = 20)
plt.ylabel("Counts", fontsize = 20)
plt.xticks(rotation = 90)
plt.xticks(range(len(D)), list(D.keys()), fontsize = 20)

plt.subplot(2, 1, 2)
D = dict(Counter([item for sublist in depl_sentences for item in sublist]).most_common(20))
plt.bar(range(len(D)), list(D.values()), align='center')
plt.title('Most Common K-mers for Neanderthal Depleted Regions', fontsize = 20)
plt.ylabel("Counts", fontsize = 20)
plt.xticks(rotation = 90)
plt.xticks(range(len(D)), list(D.keys()), fontsize = 20)

plt.show()

We can see that A- and T-rich K-mers seem to be the most common in both the Neanderthal introgressed and depleted regions. However, there are small differences in their counts that might be indicative of a different composition of the two classes. We can also build data frames of K-mer counts for both the Neanderthal introgressed and depleted regions and display them:

In [7]:
import pandas as pd
intr_counts = dict(Counter([item for sublist in intr_sentences for item in sublist]))
kmers = list(intr_counts.keys())
counts = list(intr_counts.values())
intr_df = pd.DataFrame({'Kmer': kmers, 'Count': counts})
intr_df = intr_df.sort_values(['Count'], ascending = False)
intr_df.head(10)
Out[7]:
Kmer Count
814 TTTTT 245205
365 AAAAA 205526
176 ATTTT 150409
571 AAAAT 146619
673 TATTT 126974
177 TTTTA 125572
466 TAAAA 121201
747 AAATA 120879
458 TTTCT 120240
421 AGAAA 116255
In [8]:
import pandas as pd
depl_counts = dict(Counter([item for sublist in depl_sentences for item in sublist]))
kmers = list(depl_counts.keys())
counts = list(depl_counts.values())
depl_df = pd.DataFrame({'Kmer': kmers, 'Count': counts})
depl_df = depl_df.sort_values(['Count'], ascending = False)
depl_df.head(10)
Out[8]:
Kmer Count
733 AAAAA 249058
402 TTTTT 246971
396 AAAAT 149885
401 ATTTT 148656
678 TATTT 124002
20 AAATA 123933
632 TAAAA 123646
496 TTTTA 122720
382 AGAAA 116562
349 TTTCT 116292

Both data frames contain the same K-mer vocabulary in the Kmer column, but in a different order due to the different counts of the K-mers. Now we are going to merge the two data frames and calculate the Odds ratio between the counts for each K-mer. Then we will order the K-mers by the Odds ratio between the depleted and introgressed regions. The K-mers with the Odds ratios deviating most from 1 are the most discriminative between the two classes of genomic regions.

In [9]:
merge_df = pd.merge(intr_df, depl_df, on = 'Kmer')
merge_df.columns = ['Kmer','Count_Intr','Count_Depl']
merge_df['Odds_Depl2Intr'] = merge_df['Count_Depl'] / merge_df['Count_Intr']
sorted_merge_df = merge_df.sort_values(['Odds_Depl2Intr'], ascending = False)
sorted_merge_df.head()
Out[9]:
Kmer Count_Intr Count_Depl Odds_Depl2Intr
1015 CCGCG 1926 3143 1.631880
1014 CGCGC 2185 3531 1.616018
865 GCGCC 5761 8439 1.464850
1017 CGGCG 1904 2649 1.391282
834 CGGCC 6758 9360 1.385025
In [10]:
sorted_merge_df.tail()
Out[10]:
Kmer Count_Intr Count_Depl Odds_Depl2Intr
588 GTCAT 32957 30285 0.918925
518 GGCTT 35933 32746 0.911307
1002 GCGTA 2967 2702 0.910684
905 TTACG 5006 4526 0.904115
999 TACGC 2974 2656 0.893073

We can easily see that the most discriminative K-mers between the Neanderthal depleted and introgressed regions, i.e. those overrepresented in the depleted regions, are very GC-rich. This gives us a hint that the regions of depleted Neanderthal ancestry have something to do with high GC-content. We remember from the previous section that human genes are typically GC-rich (47% GC-content against 41% GC-content genome-wide). Here we conclude that regions of depleted Neanderthal ancestry fall predominantly within human genes. This confirms the result of the section about the intersects between gene regions and Neanderthal introgressed / depleted regions.
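This can be verified directly on the top K-mers from the table above (a quick sketch; gc_fraction is a helper introduced just for this check):

```python
def gc_fraction(kmer):
    """Fraction of G and C nucleotides in a K-mer."""
    return (kmer.count('G') + kmer.count('C')) / len(kmer)

# K-mers most overrepresented in the depleted regions (from the table above)
top_depleted = ['CCGCG', 'CGCGC', 'GCGCC', 'CGGCG', 'CGGCC']

# Every one of them consists purely of G and C nucleotides
print([gc_fraction(k) for k in top_depleted])  # [1.0, 1.0, 1.0, 1.0, 1.0]
```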

We can rank all the K-mers by the deviation of their Odds ratios from 1. This will demonstrate the feature importance, or predictive power, of each K-mer:

In [11]:
sorted_merge_df['PredictPower'] = abs(sorted_merge_df['Odds_Depl2Intr'] - 1)
sorted_merge_df.head()
Out[11]:
Kmer Count_Intr Count_Depl Odds_Depl2Intr PredictPower
1015 CCGCG 1926 3143 1.631880 0.631880
1014 CGCGC 2185 3531 1.616018 0.616018
865 GCGCC 5761 8439 1.464850 0.464850
1017 CGGCG 1904 2649 1.391282 0.391282
834 CGGCC 6758 9360 1.385025 0.385025
In [12]:
sorted_merge_df.tail()
Out[12]:
Kmer Count_Intr Count_Depl Odds_Depl2Intr PredictPower
588 GTCAT 32957 30285 0.918925 0.081075
518 GGCTT 35933 32746 0.911307 0.088693
1002 GCGTA 2967 2702 0.910684 0.089316
905 TTACG 5006 4526 0.904115 0.095885
999 TACGC 2974 2656 0.893073 0.106927
In [81]:
import os
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')
sorted_merge_df.to_csv('BagOfWords_Neand_Intr_vs_Depleted.txt', header = True, index = False, sep = '\t')
In [68]:
import matplotlib.pyplot as plt
plt.figure(figsize=(20,15))
plt.bar(range(len(list(sorted_merge_df['Kmer']))), 
        list(sorted_merge_df['PredictPower']), align = 'center', width = 1)
plt.title('Predictive Power of K-mers', fontsize = 20)
plt.ylabel("Magnitude of Prediction", fontsize = 20)
plt.xticks(rotation = 90)
plt.xticks(list(range(len(list(sorted_merge_df['Kmer']))))[0::8], 
           list(sorted_merge_df['Kmer'])[0::8], fontsize = 8)
plt.show()

The y-axis of the barplot shows the magnitude of the deviation of the Odds ratio from 1 for each K-mer. The larger the deviation of the Odds ratio from 1, the more frequent the K-mer is in either the regions of depleted Neanderthal ancestry (left tail of the barplot) or the introgressed Neanderthal regions (right tail of the barplot). Here I display every 8th bar label for better visibility. Nevertheless, one can notice that the left tail of the barplot (K-mers frequent in the regions of depleted Neanderthal ancestry) contains mostly GC-rich K-mers. This implies that the regions of depleted Neanderthal ancestry are more likely to be human gene regions, as genes are known to have a higher GC-content compared to the average genome-wide level.

Out of curiosity, let us check which K-mers are most frequent in the ANGPTL3 gene, which had a very low GC-content, i.e. was extremely AT-rich. First we read its sequence and split it into K-mers, then we use the Counter class to count the K-mers in the resulting list.

In [70]:
%%bash
cd /home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes/
samtools faidx hg19.fa.gz chr1:63063158-63071976 | sed '1d' | tr -d '\n' > ANGPTL3_sequence.txt
In [75]:
import os
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes/')
with open('ANGPTL3_sequence.txt','r') as fin:
    ANGPTL3_seq = fin.read().upper()
In [78]:
import pandas as pd
ANGPTL3_counts = dict(Counter(getKmers(ANGPTL3_seq, 5)))
kmers = list(ANGPTL3_counts.keys())
counts = list(ANGPTL3_counts.values())
ANGPTL3_df = pd.DataFrame({'Kmer': kmers, 'Count': counts})
ANGPTL3_df = ANGPTL3_df.sort_values(['Count'], ascending = False)
ANGPTL3_df.head(10)
Out[78]:
Kmer Count
57 AAAAT 83
67 TAAAA 79
194 TTAAA 78
68 AAAAA 74
318 AAATA 73
54 AAATT 59
366 TTTAA 59
348 AATAA 57
396 ATTAT 53
93 TTTTA 52
In [79]:
ANGPTL3_df.tail(10)
Out[79]:
Kmer Count
784 ACACG 1
160 CGATG 1
777 ACGGA 1
169 TAGCC 1
46 CGTTG 1
44 CACGT 1
766 GGGGT 1
765 TGGGG 1
415 CGGTT 1
882 CCCCC 1

As we can see, the ANGPTL3 gene is indeed very AT-rich. The most frequent K-mers seem to contain only A and T nucleotides. In contrast, GC-rich K-mers are not very common in the ANGPTL3 gene.

CountVectorizer and Random Forest Classification

Now we are going to use the Bag of Words model, i.e. K-mer counting, to train a simple Random Forest classifier that should learn to classify sequences as belonging to Neanderthal introgressed or depleted regions. As previously, we start by reading the two fasta-files with the introgressed and depleted sequences into memory.

In [1]:
import os
from Bio import SeqIO
from Bio.Seq import Seq

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

intr_file = 'hg19_intr_clean.fa'
depl_file = 'hg19_depl_clean.fa'

e = 0
intr_seqs = []
depl_seqs = []
for intr, depl in zip(SeqIO.parse(intr_file, 'fasta'), SeqIO.parse(depl_file, 'fasta')):
    
    cutoff = 8800
    my_intr_seq = str(intr.seq)[0:cutoff]
    my_depl_seq = str(depl.seq)[0:cutoff]
    
    intr_seqs.append(my_intr_seq)
    #my_intr_reverse_compliment_seq = str(Seq(my_intr_seq).reverse_complement())
    #intr_seqs.append(my_intr_reverse_compliment_seq)
    
    depl_seqs.append(my_depl_seq)
    #my_depl_reverse_compliment_seq = str(Seq(my_depl_seq).reverse_complement())
    #depl_seqs.append(my_depl_reverse_compliment_seq)
    
    e = e + 1
    if e%20000 == 0:
        print('Finished ' + str(e) + ' entries')
Finished 20000 entries
Finished 40000 entries
Finished 60000 entries

Now we will use the getKmers function that splits the sequences into K-mers. We apply this function to each sequence and concatenate the K-mers on the fly in a space-delimited manner in order to build sentences out of the sequences. The K-mers are the words of those sentences.

In [2]:
def getKmers(sequence, size):
    return [sequence[x:x+size].upper() for x in range(len(sequence) - size + 1)]
In [3]:
kmer = 5
intr_texts = [' '.join(getKmers(i, kmer)) for i in intr_seqs]
In [4]:
intr_texts[0][0:155]
Out[4]:
'AATGA ATGAC TGACA GACAT ACATT CATTA ATTAC TTACT TACTA ACTAT CTATG TATGA ATGAC TGACA GACAA ACAAT CAATT AATTT ATTTG TTTGC TTGCT TGCTT GCTTG CTTGA TTGAG TGAGA'
In [5]:
depl_texts = [' '.join(getKmers(i, kmer)) for i in depl_seqs]
In [6]:
depl_texts[0][0:155]
Out[6]:
'CTGCC TGCCA GCCAT CCATT CATTG ATTGC TTGCC TGCCT GCCTC CCTCT CTCTC TCTCC CTCCA TCCAC CCACA CACAC ACACA CACAA ACAAA CAAAT AAATA AATAC ATACA TACAC ACACA CACAT'

Thus we have built two lists of sentences (intr_texts and depl_texts), one for the introgressed sequences, the other for the depleted ones. One can say that e.g. intr_texts is a text with many sentences, and those sentences consist of K-mers as words / tokens. The words are all of the same length and are space-delimited within each sentence. We are going to merge the two texts into a single big list / text of sentences, and create a list of labels for the sentences, where 1 corresponds to sentences coming from the Neanderthal introgressed text and 0 to sentences coming from the Neanderthal depleted text.

In [7]:
merge_texts = intr_texts + depl_texts
len(merge_texts)
Out[7]:
147468
In [8]:
import numpy as np
labels = list(np.ones(len(intr_texts))) + list(np.zeros(len(depl_texts)))
print(len(labels))
147468

Now we are going to apply a simple one-hot-encoding-like transform to each word in the vocabulary of size 4^kmer in order to convert the words / K-mers in each sentence into a numeric / computer-friendly form. Essentially we are building a matrix of counts for each word / K-mer, where the columns are the K-mers and the rows are the sentences / sequences split into K-mers. The elements of the matrix show how many times each word / K-mer is found in each sentence / sequence. This can be done via the scikit-learn class CountVectorizer. We will now have a look at what the transformed data looks like, as well as at the dimensions of the transformed data matrix:
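Before running it on the full data, here is what CountVectorizer does on a toy pair of K-mer sentences (a minimal sketch; note that it lowercases tokens by default):

```python
from sklearn.feature_extraction.text import CountVectorizer

# Two tiny "sentences" of space-delimited K-mer words
docs = ['AAAAA TTTTT AAAAA', 'CCGCG TTTTT']

cv = CountVectorizer()
X = cv.fit_transform(docs)

# The vocabulary is lowercased and sorted alphabetically
print(sorted(cv.vocabulary_))  # ['aaaaa', 'ccgcg', 'ttttt']
# Row i holds the K-mer counts of sentence i
print(X.toarray())             # [[2 0 1]
                               #  [0 1 1]]
```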

In [10]:
from keras.preprocessing.text import Tokenizer
from keras.preprocessing.sequence import pad_sequences
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_extraction.text import TfidfTransformer

import warnings
warnings.filterwarnings('ignore')

cv = CountVectorizer()
X = cv.fit_transform(merge_texts)

#tfidf_transformer = TfidfTransformer()
#X = tfidf_transformer.fit_transform(X)

#tokenizer = Tokenizer()
#tokenizer.fit_on_texts(merge_texts)
#encoded_docs = tokenizer.texts_to_sequences(merge_texts)
#max_length = max([len(s.split()) for s in merge_texts])
#X = pad_sequences(encoded_docs, maxlen=max_length, padding='post')

print(X.toarray())
print('\n')
print(X.shape)
[[42 10 13 ... 17 19 29]
 [61 27 26 ... 12  8 16]
 [61 27 26 ... 12  8 16]
 ...
 [33 16 17 ... 36 20 50]
 [53  4 13 ... 10 14 15]
 [59 18 26 ... 22 15 28]]


(147468, 1024)

Now we have built the input matrix X and the list of labels to be fed into the classifier. Next, as is standard, we are going to randomly split the data set X into a training sub-set, X_train, and a testing sub-set, X_test. The same random split will be simultaneously done for the labels.

In [11]:
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X.toarray(), labels, test_size=0.20, random_state=42)
In [12]:
print(X_train.shape)
print(X_test.shape)
(117974, 1024)
(29494, 1024)

Now we will train the Random Forest classifier and evaluate it on the test data set.

In [13]:
from sklearn.svm import LinearSVC
from xgboost import XGBClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
#classifier = GaussianNB()
#classifier = LinearSVC()
#classifier = LogisticRegression()
classifier = RandomForestClassifier(n_estimators = 500)
#classifier = XGBClassifier(n_estimators = 100)
classifier.fit(X_train, y_train)
Out[13]:
RandomForestClassifier(bootstrap=True, class_weight=None, criterion='gini',
                       max_depth=None, max_features='auto', max_leaf_nodes=None,
                       min_impurity_decrease=0.0, min_impurity_split=None,
                       min_samples_leaf=1, min_samples_split=2,
                       min_weight_fraction_leaf=0.0, n_estimators=500,
                       n_jobs=None, oob_score=False, random_state=None,
                       verbose=0, warm_start=False)
In [14]:
y_pred = classifier.predict(X_test)
In [15]:
import pandas as pd
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score
print("Confusion matrix\n")
print(pd.crosstab(pd.Series(y_test, name='Actual'), pd.Series(y_pred, name='Predicted')))
def get_metrics(y_test, y_predicted):
    accuracy = accuracy_score(y_test, y_predicted)
    precision = precision_score(y_test, y_predicted, average='weighted')
    recall = recall_score(y_test, y_predicted, average='weighted')
    f1 = f1_score(y_test, y_predicted, average='weighted')
    return accuracy, precision, recall, f1
accuracy, precision, recall, f1 = get_metrics(y_test, y_pred)
print('\n')
print("accuracy = %.3f \nprecision = %.3f \nrecall = %.3f \nf1 = %.3f" % (accuracy, precision, recall, f1))
Confusion matrix

Predicted    0.0    1.0
Actual                 
0.0        14857     56
1.0         4521  10060


accuracy = 0.845 
precision = 0.879 
recall = 0.845 
f1 = 0.841
In [19]:
import os
import pickle
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')
pickle.dump(classifier, open('RF_model_Neand_Intr_vs_Depl.sav', 'wb'))

Let us now load the saved Random Forest classifier from disk and display the Confusion Matrix and the final accuracy evaluated on the test data set.

In [13]:
import os
import pickle
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

classifier = pickle.load(open('RF_model_Neand_Intr_vs_Depl.sav', 'rb'))
predicted_labels = classifier.predict(X_test)
In [16]:
predicted_labels
Out[16]:
array([1., 0., 1., ..., 0., 1., 0.])
In [17]:
import itertools
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import confusion_matrix

plt.figure(figsize=(15,10))

cm = confusion_matrix(y_test, [np.round(i) for i in predicted_labels])
print('Confusion matrix:\n',cm)

cm = cm.astype('float') / cm.sum(axis = 1)[:, np.newaxis]

plt.imshow(cm, cmap = plt.cm.Blues)
plt.title('Normalized confusion matrix', fontsize = 20)
plt.colorbar()
plt.xlabel('Predicted label', fontsize = 20)
plt.ylabel('True label', fontsize = 20)
for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):
    plt.text(j, i, format(cm[i, j], '.2f'),
             horizontalalignment = 'center', verticalalignment = 'center', fontsize = 20,
             color='white' if cm[i, j] > 0.5 else 'black')
plt.show()
Confusion matrix:
 [[14857    56]
 [ 4521 10060]]

From the Confusion Matrix of the Random Forest classifier (above) it becomes clear that the model was very accurate when predicting segments of depleted Neanderthal ancestry, but performed much worse (worse than the neural network) on classifying introgressed regions.
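This asymmetry can be quantified directly from the raw counts printed above: per-class recall is the diagonal divided by the row sums (rows of the scikit-learn confusion matrix are the actual labels).

```python
import numpy as np

# Confusion matrix printed above: rows = actual (0 = depleted, 1 = introgressed)
cm = np.array([[14857,    56],
               [ 4521, 10060]])

# Recall per class: fraction of each actual class that was predicted correctly
recall_per_class = cm.diagonal() / cm.sum(axis=1)
print(recall_per_class)  # ~0.996 for depleted vs. only ~0.690 for introgressed
```

So the model recovers almost all depleted segments but misses roughly 31% of the introgressed ones.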

In [20]:
from sklearn.metrics import accuracy_score
score = accuracy_score(y_test, predicted_labels)
print("Accuracy: %.2f%%" % (score*100))
Accuracy: 84.48%
In [21]:
classifier.feature_importances_
Out[21]:
array([0.00213581, 0.00140425, 0.00127744, ..., 0.00131052, 0.0013646 ,
       0.00160065])

Surprisingly, we achieve quite high accuracy with the simple Random Forest classifier using the Bag Of Words (essentially K-mer frequencies) NLP model. We can also display the feature importances that tree-based methods such as the Random Forest provide:

In [16]:
names = cv.get_feature_names()
names = [i.upper() for i in names]
feature_import = sorted(zip(map(lambda x: str(round(x, 5)), classifier.feature_importances_), names), 
                        reverse = True)
feature_import[0:20]
Out[16]:
[('0.00214', 'AAAAA'),
 ('0.00169', 'CAAAA'),
 ('0.00162', 'CATTT'),
 ('0.0016', 'TTTTT'),
 ('0.00155', 'ATGCA'),
 ('0.00147', 'TGTGT'),
 ('0.00147', 'CCTTC'),
 ('0.00147', 'AAATG'),
 ('0.00146', 'TTCTG'),
 ('0.00146', 'ACACA'),
 ('0.00145', 'CTTCT'),
 ('0.00144', 'GCTTT'),
 ('0.00143', 'AAAAT'),
 ('0.00142', 'ACAAA'),
 ('0.00141', 'GTGTG'),
 ('0.00141', 'CCAGC'),
 ('0.00141', 'ATATA'),
 ('0.00141', 'AATGA'),
 ('0.0014', 'AAAAC'),
 ('0.00138', 'TCTCT')]
In [18]:
import matplotlib.pyplot as plt
plt.figure(figsize=(20,15))

importances = classifier.feature_importances_
std = np.std([tree.feature_importances_ for tree in classifier.estimators_], axis=0)
indices = np.argsort(importances)[::-1]

plt.title("Feature importances", fontsize = 20)
plt.bar(range(X_train.shape[1])[0:50], importances[indices][0:50], 
        yerr = std[indices][0:50], align = "center")
plt.ylabel('Score', fontsize = 20)
plt.xticks(range(X_train.shape[1])[0:50], np.array(names)[indices][0:50], rotation = 90)
plt.show()

We can see that the most common AT-rich K-mers turn out to be the most informative for the Random Forest classifier within the Bag Of Words NLP model. Therefore, although those AT-rich K-mers are frequent in both Neanderthal-introgressed and depleted sequences, small differences in their counts between the two types of sequences allow the Random Forest to accurately classify a given sequence. Let us take the ANGPTL3 sequence and make a prediction for that sequence:

In [1]:
import os
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes/')
with open('ANGPTL3_sequence.txt','r') as fin:
    ANGPTL3_seq = fin.read().upper()
In [2]:
import os
from Bio import SeqIO
from Bio.Seq import Seq

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

intr_file = 'hg19_intr_clean.fa'
depl_file = 'hg19_depl_clean.fa'

e = 0
intr_seqs = []
depl_seqs = []
for intr, depl in zip(SeqIO.parse(intr_file, 'fasta'), SeqIO.parse(depl_file, 'fasta')):
    
    cutoff = 8800
    my_intr_seq = str(intr.seq)[0:cutoff]
    my_depl_seq = str(depl.seq)[0:cutoff]
    
    intr_seqs.append(my_intr_seq)
    #my_intr_reverse_compliment_seq = str(Seq(my_intr_seq).reverse_complement())
    #intr_seqs.append(my_intr_reverse_compliment_seq)
    
    depl_seqs.append(my_depl_seq)
    #my_depl_reverse_compliment_seq = str(Seq(my_depl_seq).reverse_complement())
    #depl_seqs.append(my_depl_reverse_compliment_seq)
    
    e = e + 1
    if e%20000 == 0:
        print('Finished ' + str(e) + ' entries')
        
def getKmers(sequence, size):
    return [sequence[x:x+size].upper() for x in range(len(sequence) - size + 1)]

kmer = 5
intr_texts = [' '.join(getKmers(i, kmer)) for i in intr_seqs]
depl_texts = [' '.join(getKmers(i, kmer)) for i in depl_seqs]
Finished 20000 entries
Finished 40000 entries
Finished 60000 entries
In [5]:
import os
import pickle
from sklearn.feature_extraction.text import CountVectorizer
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

classifier = pickle.load(open('RF_model_Neand_Intr_vs_Depl.sav', 'rb'))
cv = CountVectorizer()
merge_texts = intr_texts + depl_texts
merge_texts.append(' '.join(getKmers(ANGPTL3_seq[0:cutoff], kmer)))
X_ANGPTL3 = cv.fit_transform(merge_texts)
int(classifier.predict(X_ANGPTL3[-1,:].toarray()))
Out[5]:
0
In [6]:
classifier.predict_proba(X_ANGPTL3[-1,:].toarray())
Out[6]:
array([[0.658, 0.342]])

We can see that the Random Forest classifier gives a 66% probability that this gene was not inherited from Neanderthals; the predicted label 0 means the ANGPTL3 sequence most likely falls within the regions of depleted Neanderthal ancestry. This is not a high probability, and we know from the confusion matrix that when we predict 0 the true label is actually 1 in roughly a quarter of cases (4521 out of 19378 predictions of 0), so the prediction for ANGPTL3 might not be accurate enough. Thus we cannot make a confident prediction for ANGPTL3, but perhaps other genes can be confidently predicted to be inherited from Neanderthals. We can loop over all genes and use the trained Random Forest classifier to make a prediction for each of them.
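One way to act on this uncertainty, sketched below with made-up probabilities standing in for predict_proba output, is to accept a prediction only when the winning class exceeds a confidence threshold:

```python
import numpy as np

# Hypothetical predict_proba output for four genes
# (columns: P(depleted), P(introgressed)); first row mimics ANGPTL3
probs = np.array([[0.658, 0.342],
                  [0.032, 0.968],
                  [0.910, 0.090],
                  [0.550, 0.450]])

threshold = 0.8  # only call a gene if the winning class exceeds this probability
confident = probs.max(axis=1) >= threshold
calls = np.where(confident, probs.argmax(axis=1), -1)  # -1 = no confident call

print(calls)  # [-1  1  0 -1]
```

With such a cutoff, the ANGPTL3-like row would be left uncalled rather than labeled as depleted.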

Detecting Genes Inherited from Neanderthals

Now we will use the trained Random Forest classifier in order to make predictions for all genes in the hg19 human reference genome. For this purpose we will have to 1) select sufficiently long genes with length > cut, and 2) merge the gene sentences together with the introgressed and depleted sentences so that they are all processed through the CountVectorizer together and the sentences for prediction are prepared in exactly the same way as the data used for training the Random Forest classifier. As usual, we will start with reading the introgressed and depleted sequences, splitting them into K-mers and building sentences / texts.
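A caveat worth noting: re-fitting CountVectorizer on a new corpus only yields columns that match the training matrix if all 4^5 = 1024 possible 5-mers occur in both corpora (the alphabetical column ordering then coincides). A defensive alternative, shown here as a sketch rather than as what this notebook does, is to fix the vocabulary explicitly:

```python
from itertools import product
from sklearn.feature_extraction.text import CountVectorizer

# Enumerate all 4^5 = 1024 possible 5-mers as a fixed, ordered vocabulary
vocab = [''.join(p) for p in product('acgt', repeat=5)]
cv = CountVectorizer(vocabulary=vocab)

# With a fixed vocabulary, any corpus maps to the same 1024 columns,
# and transform() can be used directly without fitting
X = cv.transform(['aaaaa aaaac tttttt'.lower()])
print(X.shape)  # (1, 1024)
```

Fitting on the training corpus and reusing the same fitted CountVectorizer (or a fixed vocabulary like this) for prediction guarantees that K-mer counts land in the same columns the classifier was trained on.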

In [1]:
import os
from Bio import SeqIO
from Bio.Seq import Seq

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

intr_file = 'hg19_intr_clean.fa'
depl_file = 'hg19_depl_clean.fa'

e = 0
intr_seqs = []
depl_seqs = []
for intr, depl in zip(SeqIO.parse(intr_file, 'fasta'), SeqIO.parse(depl_file, 'fasta')):
    
    cutoff = 8800
    my_intr_seq = str(intr.seq)[0:cutoff]
    my_depl_seq = str(depl.seq)[0:cutoff]
    
    intr_seqs.append(my_intr_seq)
    depl_seqs.append(my_depl_seq)
    
    e = e + 1
    if e%20000 == 0:
        print('Finished ' + str(e) + ' entries')
        
def getKmers(sequence, size):
    return [sequence[x:x+size].upper() for x in range(len(sequence) - size + 1)]

kmer = 5
intr_texts = [' '.join(getKmers(i, kmer)) for i in intr_seqs]
depl_texts = [' '.join(getKmers(i, kmer)) for i in depl_seqs]
Finished 20000 entries
Finished 40000 entries
Finished 60000 entries
In [2]:
print(len(intr_texts))
print(len(depl_texts))
73734
73734

Now we will do the same for the gene sequences, i.e. read them, split them into K-mers and build sentences / texts:

In [1]:
import os
from Bio import SeqIO

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes/')

gene_file = 'hg19_gene_clean.fa'

a = 0
gene_seqs = []
gene_ids = []
for gene in SeqIO.parse(gene_file, 'fasta'):
    cut = 8800
    if len(str(gene.seq)) < cut:
        continue
    s_gene = str(gene.seq)[0:cut]
    if s_gene.count('A')>0 and s_gene.count('C')>0 and s_gene.count('G')>0 and s_gene.count('T')>0:
        gene_seqs.append(s_gene)
        gene_ids.append(str(gene.id))
    a = a + 1
    if a%10000 == 0:
        print('Finished ' + str(a) + ' genes')

def getKmers(sequence, size):
    return [sequence[x:x+size].upper() for x in range(len(sequence) - size + 1)]

kmer = 5
gene_texts = [' '.join(getKmers(i, kmer)) for i in gene_seqs]
Finished 10000 genes
Finished 20000 genes
In [2]:
print(len(gene_texts))
print(len(gene_ids))
21503
21503
In [3]:
gene_ids[0:10]
Out[3]:
['chr1:14362-29370',
 'chr1:700245-714068',
 'chr1:762971-778984',
 'chr1:762971-794826',
 'chr1:763178-794826',
 'chr1:861121-879961',
 'chr1:879583-894679',
 'chr1:1017198-1051736',
 'chr1:1152288-1167447',
 'chr1:1189292-1209234']
In [4]:
import os
import pickle
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

with open('gene_ids.txt', 'w') as f:
    for item in gene_ids:
        f.write("%s\n" % item)
In [6]:
merge_texts = intr_texts + gene_texts
In [7]:
import os
import pickle
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

with open('merge_texts.txt', 'wb') as fp:
    pickle.dump(merge_texts, fp)
In [ ]:
 

Here we need to restart the Kernel in order to release memory, and then read the merge_texts list back into memory:

In [1]:
import os
import pickle
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

with open ('merge_texts.txt', 'rb') as fp:
    merge_texts = pickle.load(fp)
In [5]:
import os
import pickle
from sklearn.feature_extraction.text import CountVectorizer
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

classifier = pickle.load(open('RF_model_Neand_Intr_vs_Depl.sav', 'rb'))
cv = CountVectorizer()
X_gene = cv.fit_transform(gene_texts)
gene_predictions = classifier.predict(X_gene.toarray())
gene_predictions_prob = classifier.predict_proba(X_gene.toarray())
#gene_predictions = classifier.predict(X_gene[-X_gene.shape[0]:21503,:].toarray())
#gene_predictions_prob = classifier.predict_proba(X_gene[-X_gene.shape[0]:21503,:].toarray())
In [6]:
len(gene_predictions)
Out[6]:
21503
In [7]:
X_gene.shape
Out[7]:
(21503, 1024)
In [8]:
gene_predictions
Out[8]:
array([0., 0., 0., ..., 0., 0., 0.])
In [14]:
import numpy as np
print(np.sum(gene_predictions==0))
print(np.sum(gene_predictions==1))
21026
477
In [9]:
gene_predictions_prob
Out[9]:
array([[0.674, 0.326],
       [0.694, 0.306],
       [0.592, 0.408],
       ...,
       [0.574, 0.426],
       [0.574, 0.426],
       [0.584, 0.416]])
In [10]:
gene_predictions_prob_0 = [i[0] for i in gene_predictions_prob]
In [11]:
gene_predictions_prob_1 = [i[1] for i in gene_predictions_prob]
In [23]:
import os
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')
gene_ids = []
gene_symbol = []
with open('gene_ids.txt','r') as fin:
    for line in fin:
        line = line.split('\t')
        gene_ids.append(line[0])
        gene_symbol.append(line[1].rstrip())
In [31]:
import pandas as pd
gene_pred_df = pd.DataFrame({'Gene': gene_ids, 'Gene_Symbol': gene_symbol, 'Predict': gene_predictions, 
                             'Prob_0': gene_predictions_prob_0, 'Prob_1': gene_predictions_prob_1})
gene_pred_df = gene_pred_df.sort_values(['Prob_1'], ascending = False)
gene_pred_df.head(30)
Out[31]:
Gene Gene_Symbol Predict Prob_0 Prob_1
12554 chr3:19988572-20026667 RAB5A 1.0 0.032 0.968
16186 chr6:961241-1101567 LINC01622 1.0 0.046 0.954
2966 chr11:4665156-4676716 OR51E1 1.0 0.066 0.934
15270 chr5:57878871-58155222 RAB3C 1.0 0.094 0.906
10923 chr2:179694484-179914786 CCDC141 1.0 0.094 0.906
6014 chr14:101355986-101465450 MEG8 1.0 0.100 0.900
16252 chr6:8435856-8785678 LOC100506207 1.0 0.136 0.864
2440 chr10:87359312-88126250 GRID1 1.0 0.136 0.864
16251 chr6:8435856-8712526 LOC100506207 1.0 0.136 0.864
382 chr1:33231235-33240571 KIAA1522 1.0 0.154 0.846
12301 chr22:32870707-32894818 FBXO7 1.0 0.158 0.842
19128 chr8:17154306-17271040 MTMR7 1.0 0.168 0.832
4853 chr12:96043031-96067770 PGAM1P5 1.0 0.168 0.832
12597 chr3:29322803-30032809 RBMS3 1.0 0.174 0.826
12598 chr3:29322803-30051886 RBMS3 1.0 0.174 0.826
16047 chr5:169064251-169510386 DOCK2 1.0 0.176 0.824
13843 chr3:191857182-192445388 FGF12 1.0 0.178 0.822
13842 chr3:191857182-192126838 FGF12 1.0 0.178 0.822
16878 chr6:80340822-80413387 SH3BGRL2 1.0 0.180 0.820
3739 chr11:95523625-95565857 CEP57 1.0 0.182 0.818
10325 chr2:86730553-86790620 CHMP3 1.0 0.186 0.814
10326 chr2:86730553-86948245 RNF103-CHMP3 1.0 0.186 0.814
2238 chr10:61786056-61900774 ANK3 1.0 0.192 0.808
2239 chr10:61786056-62149742 ANK3 1.0 0.192 0.808
2240 chr10:61786056-62332714 ANK3 1.0 0.192 0.808
2241 chr10:61786056-62493284 ANK3 1.0 0.192 0.808
3761 chr11:102391239-102401484 MMP7 1.0 0.194 0.806
13506 chr3:147795946-147805816 LINC02032 1.0 0.200 0.800
10404 chr2:99235569-99279936 MGAT4A 1.0 0.214 0.786
10405 chr2:99235569-99347589 MGAT4A 1.0 0.214 0.786
In [30]:
gene_pred_df[gene_pred_df['Prob_1']>0.8].shape
Out[30]:
(27, 5)
In [32]:
gene_pred_df[gene_pred_df['Gene_Symbol']=="ANGPTL3"]
Out[32]:
Gene Gene_Symbol Predict Prob_0 Prob_1
634 chr1:63063158-63071976 ANGPTL3 0.0 0.658 0.342
In [33]:
gene_pred_df.tail(30)
Out[33]:
Gene Gene_Symbol Predict Prob_0 Prob_1
9274 chr19:42901280-42912604 LOC101930071 0.0 0.876 0.124
18 chr1:1413495-1431584 ATAD3B 0.0 0.876 0.124
18318 chr7:66147078-66276448 RABGEF1 0.0 0.876 0.124
9412 chr19:48958964-48969367 KCNJ14 0.0 0.878 0.122
12785 chr3:47537130-47555199 ELP6 0.0 0.880 0.120
4110 chr12:6420099-6437672 PLEKHG6 0.0 0.880 0.120
1149 chr1:155204239-155214653 GBA 0.0 0.884 0.116
1744 chr1:235330210-235491532 ARID4B 0.0 0.884 0.116
1743 chr1:235330210-235490802 ARID4B 0.0 0.884 0.116
13 chr1:1288069-1298921 MXRA8 0.0 0.884 0.116
3471 chr11:64948686-64979477 CAPN1 0.0 0.886 0.114
5137 chr12:123259056-123311927 CCDC62 0.0 0.886 0.114
17953 chr7:5965777-6010314 RSPH10B 0.0 0.888 0.112
19927 chr8:145597704-145618453 ADCK5 0.0 0.888 0.112
6054 chr14:105956192-105965585 C14orf80 0.0 0.888 0.112
17952 chr7:5965777-6010314 RSPH10B 0.0 0.888 0.112
8047 chr17:73720776-73753899 ITGB4 0.0 0.890 0.110
19920 chr8:145106167-145115606 OPLAH 0.0 0.892 0.108
11708 chr20:32319566-32380075 ZNF341 0.0 0.894 0.106
8634 chr19:496490-505343 MADCAM1 0.0 0.898 0.102
1759 chr1:236558716-236648008 EDARADD 0.0 0.904 0.096
20487 chr9:131857073-131873077 CRAT 0.0 0.906 0.094
6935 chr16:66638228-66647795 CMTM3 0.0 0.906 0.094
17725 chr6_mcf_hap5:3144866-3154459 LSM2 0.0 0.908 0.092
17803 chr6_qbl_hap6:3058814-3068398 LSM2 0.0 0.908 0.092
17608 chr6_dbb_hap3:3050751-3060344 LSM2 0.0 0.910 0.090
6055 chr14:105956520-105965585 C14orf80 0.0 0.910 0.090
17513 chr6_cox_hap2:3274737-3284332 LSM2 0.0 0.910 0.090
16482 chr6:31765169-31774761 LSM2 0.0 0.910 0.090
19906 chr8:144873090-144897549 SCRIB 0.0 0.916 0.084
In [35]:
gene_pred_df.to_csv('Neanderthal_Genes.txt', header=True, index = False, sep = "\t")

Multilayer Perceptron for Neanderthal Introgressed vs. Depleted Sequence Classification

Here we will still use the Bag of Words model but implement a Multilayer Perceptron for classification of sequences from regions of Neanderthal introgression vs. regions depleted of Neanderthal ancestry. We again start with reading the sequences from the two fasta-files (introgressed and depleted regions), splitting them into words / K-mers and building sentences out of them.

In [1]:
import os
from Bio import SeqIO
from Bio.Seq import Seq

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

intr_file = 'hg19_intr_clean.fa'
depl_file = 'hg19_depl_clean.fa'

e = 0
intr_seqs = []
depl_seqs = []
for intr, depl in zip(SeqIO.parse(intr_file, 'fasta'), SeqIO.parse(depl_file, 'fasta')):
    
    cutoff = 10000
    my_intr_seq = str(intr.seq)[0:cutoff]
    my_depl_seq = str(depl.seq)[0:cutoff]
    
    intr_seqs.append(my_intr_seq)
    #my_intr_reverse_compliment_seq = str(Seq(my_intr_seq).reverse_complement())
    #intr_seqs.append(my_intr_reverse_compliment_seq)
    
    depl_seqs.append(my_depl_seq)
    #my_depl_reverse_compliment_seq = str(Seq(my_depl_seq).reverse_complement())
    #depl_seqs.append(my_depl_reverse_compliment_seq)
    
    e = e + 1
    if e%20000 == 0:
        print('Finished ' + str(e) + ' entries')
        
def getKmers(sequence, size):
    return [sequence[x:x+size].upper() for x in range(len(sequence) - size + 1)]

kmer = 5
intr_texts = [' '.join(getKmers(i, kmer)) for i in intr_seqs]
depl_texts = [' '.join(getKmers(i, kmer)) for i in depl_seqs]
Finished 20000 entries
Finished 40000 entries
Finished 60000 entries

As we did previously, we will use CountVectorizer in order to convert words / K-mers into a numeric representation; the elements of the obtained matrix X show how often each word / K-mer occurs in each sentence / sequence.

In [2]:
merge_texts = intr_texts + depl_texts
len(merge_texts)
Out[2]:
147468
In [3]:
import numpy as np
labels = list(np.ones(len(intr_texts))) + list(np.zeros(len(depl_texts)))
print(len(labels))
147468
In [5]:
from keras.preprocessing.text import Tokenizer
from keras.preprocessing.sequence import pad_sequences
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_extraction.text import TfidfTransformer

import warnings
warnings.filterwarnings('ignore')

cv = CountVectorizer()
X = cv.fit_transform(merge_texts)

#tfidf_transformer = TfidfTransformer()
#X = tfidf_transformer.fit_transform(X)

#tokenizer = Tokenizer()
#tokenizer.fit_on_texts(merge_texts)
#X = tokenizer.texts_to_matrix(merge_texts, mode = 'freq')

#encoded_docs = tokenizer.texts_to_sequences(merge_texts)
#max_length = max([len(s.split()) for s in merge_texts])
#X = pad_sequences(encoded_docs, maxlen = max_length, padding = 'post')

X = np.int32(X.toarray())

print(X)
print('\n')
print(X.shape)
[[47 11 15 ... 29 25 49]
 [61 27 27 ... 13 10 16]
 [61 27 27 ... 13 10 16]
 ...
 [43 23 22 ... 40 24 52]
 [54  4 13 ... 10 14 15]
 [63 20 27 ... 27 17 34]]


(147468, 1024)

Again, we split the data set into training, X_train, and testing, X_test, subsets:

In [6]:
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size = 0.20, random_state = 42)
print(X_train.shape)
print(X_test.shape)
(117974, 1024)
(29494, 1024)

Now the data is prepared to be fed into the Multilayer Perceptron model for classification of sequences coming from Neanderthal introgressed vs. depleted Neanderthal ancestry regions.

In [44]:
from keras.models import Sequential
from keras.regularizers import l2, l1
from keras.callbacks import ModelCheckpoint
from keras.optimizers import SGD, Adam, Adadelta
from keras.layers import Conv1D, Dense, MaxPooling1D, Flatten, Dropout, Embedding, Activation

model = Sequential()
model.add(Dense(3000, input_shape=(X.shape[1],), activation = 'sigmoid', 
                kernel_regularizer = l1(0.00001)))
#model.add(Dense(1000, activation = 'sigmoid'))
#model.add(Dense(100, activation = 'sigmoid'))
#model.add(Dense(10, activation = 'sigmoid'))
#model.add(Dropout(0.5))
model.add(Dense(1, activation = 'sigmoid'))

epochs = 200
lrate = 0.0001
#decay = lrate / epochs
#sgd = SGD(lr = lrate, momentum = 0.99, nesterov = True)
sgd = SGD(lr = lrate, momentum = 0.9, nesterov = False)
#sgd = SGD(lr = lrate, momentum = 0.9, decay = decay, nesterov = False)
#model.compile(loss = 'binary_crossentropy', optimizer = Adam(lr = lrate), metrics = ['binary_accuracy'])
#model.compile(loss = 'binary_crossentropy', optimizer = 'adam', metrics = ['binary_accuracy'])
model.compile(loss = 'binary_crossentropy', optimizer = sgd, metrics = ['binary_accuracy'])
checkpoint = ModelCheckpoint("weights.best.hdf5", monitor = 'val_binary_accuracy', verbose = 1, 
                             save_best_only = True, mode = 'max')

model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_17 (Dense)             (None, 3000)              3075000   
_________________________________________________________________
dense_18 (Dense)             (None, 1)                 3001      
=================================================================
Total params: 3,078,001
Trainable params: 3,078,001
Non-trainable params: 0
_________________________________________________________________
In [46]:
import warnings
warnings.filterwarnings('ignore')

history = model.fit(X_train, y_train, 
                    epochs = epochs, verbose = 1, validation_split = 0.2, batch_size = 32, shuffle = True, 
                    callbacks = [checkpoint])
Train on 94379 samples, validate on 23595 samples
Epoch 1/200
94379/94379 [==============================] - 115s 1ms/step - loss: 1.2779 - binary_accuracy: 0.5527 - val_loss: 1.2716 - val_binary_accuracy: 0.5641

Epoch 00001: val_binary_accuracy improved from -inf to 0.56414, saving model to weights.best.hdf5
Epoch 2/200
94379/94379 [==============================] - 116s 1ms/step - loss: 1.2655 - binary_accuracy: 0.5768 - val_loss: 1.2732 - val_binary_accuracy: 0.5548

Epoch 00002: val_binary_accuracy did not improve from 0.56414
Epoch 3/200
94379/94379 [==============================] - 118s 1ms/step - loss: 1.2584 - binary_accuracy: 0.5889 - val_loss: 1.2709 - val_binary_accuracy: 0.5630

Epoch 00003: val_binary_accuracy did not improve from 0.56414
Epoch 4/200
94379/94379 [==============================] - 117s 1ms/step - loss: 1.2533 - binary_accuracy: 0.5975 - val_loss: 1.2838 - val_binary_accuracy: 0.5494

Epoch 00004: val_binary_accuracy did not improve from 0.56414
Epoch 5/200
94379/94379 [==============================] - 124s 1ms/step - loss: 1.2487 - binary_accuracy: 0.6020 - val_loss: 1.2561 - val_binary_accuracy: 0.5850

Epoch 00005: val_binary_accuracy improved from 0.56414 to 0.58500, saving model to weights.best.hdf5
Epoch 6/200
94379/94379 [==============================] - 116s 1ms/step - loss: 1.2434 - binary_accuracy: 0.6105 - val_loss: 1.2566 - val_binary_accuracy: 0.5854

Epoch 00006: val_binary_accuracy improved from 0.58500 to 0.58538, saving model to weights.best.hdf5
Epoch 7/200
94379/94379 [==============================] - 117s 1ms/step - loss: 1.2386 - binary_accuracy: 0.6175 - val_loss: 1.2505 - val_binary_accuracy: 0.5928

Epoch 00007: val_binary_accuracy improved from 0.58538 to 0.59275, saving model to weights.best.hdf5
Epoch 8/200
94379/94379 [==============================] - 116s 1ms/step - loss: 1.2337 - binary_accuracy: 0.6239 - val_loss: 1.2555 - val_binary_accuracy: 0.5828

Epoch 00008: val_binary_accuracy did not improve from 0.59275
Epoch 9/200
94379/94379 [==============================] - 119s 1ms/step - loss: 1.2289 - binary_accuracy: 0.6294 - val_loss: 1.2440 - val_binary_accuracy: 0.6007

Epoch 00009: val_binary_accuracy improved from 0.59275 to 0.60068, saving model to weights.best.hdf5
Epoch 10/200
94379/94379 [==============================] - 119s 1ms/step - loss: 1.2242 - binary_accuracy: 0.6356 - val_loss: 1.2537 - val_binary_accuracy: 0.5825

Epoch 00010: val_binary_accuracy did not improve from 0.60068
Epoch 11/200
94379/94379 [==============================] - 120s 1ms/step - loss: 1.2202 - binary_accuracy: 0.6393 - val_loss: 1.2369 - val_binary_accuracy: 0.6099

Epoch 00011: val_binary_accuracy improved from 0.60068 to 0.60992, saving model to weights.best.hdf5
Epoch 12/200
94379/94379 [==============================] - 119s 1ms/step - loss: 1.2150 - binary_accuracy: 0.6444 - val_loss: 1.2386 - val_binary_accuracy: 0.6047

Epoch 00012: val_binary_accuracy did not improve from 0.60992
Epoch 13/200
94379/94379 [==============================] - 120s 1ms/step - loss: 1.2108 - binary_accuracy: 0.6471 - val_loss: 1.2357 - val_binary_accuracy: 0.6083

Epoch 00013: val_binary_accuracy did not improve from 0.60992
Epoch 14/200
94379/94379 [==============================] - 120s 1ms/step - loss: 1.2054 - binary_accuracy: 0.6541 - val_loss: 1.2373 - val_binary_accuracy: 0.6049

Epoch 00014: val_binary_accuracy did not improve from 0.60992
Epoch 15/200
94379/94379 [==============================] - 118s 1ms/step - loss: 1.2001 - binary_accuracy: 0.6593 - val_loss: 1.2276 - val_binary_accuracy: 0.6132

Epoch 00015: val_binary_accuracy improved from 0.60992 to 0.61322, saving model to weights.best.hdf5
Epoch 16/200
94379/94379 [==============================] - 122s 1ms/step - loss: 1.1944 - binary_accuracy: 0.6665 - val_loss: 1.2349 - val_binary_accuracy: 0.6069

Epoch 00016: val_binary_accuracy did not improve from 0.61322
Epoch 17/200
94379/94379 [==============================] - 119s 1ms/step - loss: 1.1901 - binary_accuracy: 0.6708 - val_loss: 1.2182 - val_binary_accuracy: 0.6273

Epoch 00017: val_binary_accuracy improved from 0.61322 to 0.62734, saving model to weights.best.hdf5
Epoch 18/200
94379/94379 [==============================] - 121s 1ms/step - loss: 1.1834 - binary_accuracy: 0.6774 - val_loss: 1.2226 - val_binary_accuracy: 0.6166

Epoch 00018: val_binary_accuracy did not improve from 0.62734
Epoch 19/200
94379/94379 [==============================] - 121s 1ms/step - loss: 1.1787 - binary_accuracy: 0.6820 - val_loss: 1.2223 - val_binary_accuracy: 0.6170

Epoch 00019: val_binary_accuracy did not improve from 0.62734
Epoch 20/200
94379/94379 [==============================] - 120s 1ms/step - loss: 1.1726 - binary_accuracy: 0.6866 - val_loss: 1.2118 - val_binary_accuracy: 0.6346

Epoch 00020: val_binary_accuracy improved from 0.62734 to 0.63463, saving model to weights.best.hdf5
Epoch 21/200
94379/94379 [==============================] - 121s 1ms/step - loss: 1.1670 - binary_accuracy: 0.6910 - val_loss: 1.2099 - val_binary_accuracy: 0.6305

Epoch 00021: val_binary_accuracy did not improve from 0.63463
Epoch 22/200
94379/94379 [==============================] - 119s 1ms/step - loss: 1.1603 - binary_accuracy: 0.6989 - val_loss: 1.2291 - val_binary_accuracy: 0.5993

Epoch 00022: val_binary_accuracy did not improve from 0.63463
Epoch 23/200
94379/94379 [==============================] - 119s 1ms/step - loss: 1.1547 - binary_accuracy: 0.7042 - val_loss: 1.2004 - val_binary_accuracy: 0.6475

Epoch 00023: val_binary_accuracy improved from 0.63463 to 0.64751, saving model to weights.best.hdf5
Epoch 24/200
94379/94379 [==============================] - 120s 1ms/step - loss: 1.1471 - binary_accuracy: 0.7107 - val_loss: 1.2050 - val_binary_accuracy: 0.6309

Epoch 00024: val_binary_accuracy did not improve from 0.64751
Epoch 25/200
94379/94379 [==============================] - 120s 1ms/step - loss: 1.1414 - binary_accuracy: 0.7144 - val_loss: 1.1922 - val_binary_accuracy: 0.6550

Epoch 00025: val_binary_accuracy improved from 0.64751 to 0.65497, saving model to weights.best.hdf5
Epoch 26/200
94379/94379 [==============================] - 121s 1ms/step - loss: 1.1352 - binary_accuracy: 0.7213 - val_loss: 1.2075 - val_binary_accuracy: 0.6359

Epoch 00026: val_binary_accuracy did not improve from 0.65497
Epoch 27/200
94379/94379 [==============================] - 121s 1ms/step - loss: 1.1276 - binary_accuracy: 0.7268 - val_loss: 1.1978 - val_binary_accuracy: 0.6373

Epoch 00027: val_binary_accuracy did not improve from 0.65497
Epoch 28/200
94379/94379 [==============================] - 120s 1ms/step - loss: 1.1222 - binary_accuracy: 0.7293 - val_loss: 1.1868 - val_binary_accuracy: 0.6444

Epoch 00028: val_binary_accuracy did not improve from 0.65497
Epoch 29/200
94379/94379 [==============================] - 121s 1ms/step - loss: 1.1145 - binary_accuracy: 0.7373 - val_loss: 1.1867 - val_binary_accuracy: 0.6589

Epoch 00029: val_binary_accuracy improved from 0.65497 to 0.65887, saving model to weights.best.hdf5
Epoch 30/200
94379/94379 [==============================] - 121s 1ms/step - loss: 1.1072 - binary_accuracy: 0.7438 - val_loss: 1.1750 - val_binary_accuracy: 0.6645

Epoch 00030: val_binary_accuracy improved from 0.65887 to 0.66451, saving model to weights.best.hdf5
Epoch 31/200
94379/94379 [==============================] - 123s 1ms/step - loss: 1.1003 - binary_accuracy: 0.7491 - val_loss: 1.1709 - val_binary_accuracy: 0.6736

Epoch 00031: val_binary_accuracy improved from 0.66451 to 0.67362, saving model to weights.best.hdf5
Epoch 32/200
94379/94379 [==============================] - 124s 1ms/step - loss: 1.0938 - binary_accuracy: 0.7526 - val_loss: 1.1800 - val_binary_accuracy: 0.6647

Epoch 00032: val_binary_accuracy did not improve from 0.67362
Epoch 33/200
94379/94379 [==============================] - 123s 1ms/step - loss: 1.0864 - binary_accuracy: 0.7593 - val_loss: 1.1910 - val_binary_accuracy: 0.6270

Epoch 00033: val_binary_accuracy did not improve from 0.67362
Epoch 34/200
94379/94379 [==============================] - 124s 1ms/step - loss: 1.0784 - binary_accuracy: 0.7646 - val_loss: 1.1720 - val_binary_accuracy: 0.6622

Epoch 00034: val_binary_accuracy did not improve from 0.67362
Epoch 35/200
94379/94379 [==============================] - 112s 1ms/step - loss: 1.0720 - binary_accuracy: 0.7689 - val_loss: 1.1541 - val_binary_accuracy: 0.6855

Epoch 00035: val_binary_accuracy improved from 0.67362 to 0.68553, saving model to weights.best.hdf5
Epoch 36/200
94379/94379 [==============================] - 115s 1ms/step - loss: 1.0611 - binary_accuracy: 0.7787 - val_loss: 1.1516 - val_binary_accuracy: 0.6812

Epoch 00036: val_binary_accuracy did not improve from 0.68553
Epoch 37/200
94379/94379 [==============================] - 113s 1ms/step - loss: 1.0545 - binary_accuracy: 0.7840 - val_loss: 1.1474 - val_binary_accuracy: 0.6885

Epoch 00037: val_binary_accuracy improved from 0.68553 to 0.68849, saving model to weights.best.hdf5
Epoch 38/200
94379/94379 [==============================] - 115s 1ms/step - loss: 1.0481 - binary_accuracy: 0.7855 - val_loss: 1.1465 - val_binary_accuracy: 0.6804

Epoch 00038: val_binary_accuracy did not improve from 0.68849
Epoch 39/200
94379/94379 [==============================] - 113s 1ms/step - loss: 1.0405 - binary_accuracy: 0.7906 - val_loss: 1.1392 - val_binary_accuracy: 0.6979

Epoch 00039: val_binary_accuracy improved from 0.68849 to 0.69790, saving model to weights.best.hdf5
Epoch 40/200
94379/94379 [==============================] - 115s 1ms/step - loss: 1.0324 - binary_accuracy: 0.7967 - val_loss: 1.1676 - val_binary_accuracy: 0.6478

Epoch 00040: val_binary_accuracy did not improve from 0.69790
Epoch 41/200
94379/94379 [==============================] - 115s 1ms/step - loss: 1.0258 - binary_accuracy: 0.7997 - val_loss: 1.1334 - val_binary_accuracy: 0.7064

Epoch 00041: val_binary_accuracy improved from 0.69790 to 0.70642, saving model to weights.best.hdf5
Epoch 42/200
94379/94379 [==============================] - 113s 1ms/step - loss: 1.0145 - binary_accuracy: 0.8093 - val_loss: 1.1520 - val_binary_accuracy: 0.6918

Epoch 00042: val_binary_accuracy did not improve from 0.70642
Epoch 43/200
94379/94379 [==============================] - 114s 1ms/step - loss: 1.0068 - binary_accuracy: 0.8151 - val_loss: 1.1267 - val_binary_accuracy: 0.7062

Epoch 00043: val_binary_accuracy did not improve from 0.70642
Epoch 44/200
94379/94379 [==============================] - 116s 1ms/step - loss: 1.0027 - binary_accuracy: 0.8144 - val_loss: 1.1403 - val_binary_accuracy: 0.6995

Epoch 00044: val_binary_accuracy did not improve from 0.70642
Epoch 45/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.9932 - binary_accuracy: 0.8214 - val_loss: 1.1303 - val_binary_accuracy: 0.6856

Epoch 00045: val_binary_accuracy did not improve from 0.70642
Epoch 46/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.9854 - binary_accuracy: 0.8269 - val_loss: 1.1340 - val_binary_accuracy: 0.7052

Epoch 00046: val_binary_accuracy did not improve from 0.70642
Epoch 47/200
94379/94379 [==============================] - 117s 1ms/step - loss: 0.9767 - binary_accuracy: 0.8326 - val_loss: 1.1442 - val_binary_accuracy: 0.6677

Epoch 00047: val_binary_accuracy did not improve from 0.70642
Epoch 48/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.9667 - binary_accuracy: 0.8402 - val_loss: 1.2314 - val_binary_accuracy: 0.6449

Epoch 00048: val_binary_accuracy did not improve from 0.70642
Epoch 49/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.9621 - binary_accuracy: 0.8407 - val_loss: 1.1021 - val_binary_accuracy: 0.7265

Epoch 00049: val_binary_accuracy improved from 0.70642 to 0.72647, saving model to weights.best.hdf5
Epoch 50/200
94379/94379 [==============================] - 117s 1ms/step - loss: 0.9554 - binary_accuracy: 0.8414 - val_loss: 1.1043 - val_binary_accuracy: 0.7099

Epoch 00050: val_binary_accuracy did not improve from 0.72647
Epoch 51/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.9454 - binary_accuracy: 0.8513 - val_loss: 1.1288 - val_binary_accuracy: 0.7095

Epoch 00051: val_binary_accuracy did not improve from 0.72647
Epoch 52/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.9361 - binary_accuracy: 0.8570 - val_loss: 1.1380 - val_binary_accuracy: 0.6682

Epoch 00052: val_binary_accuracy did not improve from 0.72647
Epoch 53/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.9287 - binary_accuracy: 0.8609 - val_loss: 1.1704 - val_binary_accuracy: 0.6476

Epoch 00053: val_binary_accuracy did not improve from 0.72647
Epoch 54/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.9186 - binary_accuracy: 0.8687 - val_loss: 1.0916 - val_binary_accuracy: 0.7276

Epoch 00054: val_binary_accuracy improved from 0.72647 to 0.72761, saving model to weights.best.hdf5
Epoch 55/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.9123 - binary_accuracy: 0.8712 - val_loss: 1.0870 - val_binary_accuracy: 0.7315

Epoch 00055: val_binary_accuracy improved from 0.72761 to 0.73151, saving model to weights.best.hdf5
Epoch 56/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.9070 - binary_accuracy: 0.8713 - val_loss: 1.0829 - val_binary_accuracy: 0.7278

Epoch 00056: val_binary_accuracy did not improve from 0.73151
Epoch 57/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.8987 - binary_accuracy: 0.8771 - val_loss: 1.1228 - val_binary_accuracy: 0.7179

Epoch 00057: val_binary_accuracy did not improve from 0.73151
Epoch 58/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.8911 - binary_accuracy: 0.8812 - val_loss: 1.0709 - val_binary_accuracy: 0.7404

Epoch 00058: val_binary_accuracy improved from 0.73151 to 0.74041, saving model to weights.best.hdf5
Epoch 59/200
94379/94379 [==============================] - 117s 1ms/step - loss: 0.8841 - binary_accuracy: 0.8836 - val_loss: 1.0877 - val_binary_accuracy: 0.7419

Epoch 00059: val_binary_accuracy improved from 0.74041 to 0.74189, saving model to weights.best.hdf5
Epoch 60/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.8762 - binary_accuracy: 0.8886 - val_loss: 1.0696 - val_binary_accuracy: 0.7349

Epoch 00060: val_binary_accuracy did not improve from 0.74189
Epoch 61/200
94379/94379 [==============================] - 117s 1ms/step - loss: 0.8681 - binary_accuracy: 0.8928 - val_loss: 1.0701 - val_binary_accuracy: 0.7513

Epoch 00061: val_binary_accuracy improved from 0.74189 to 0.75135, saving model to weights.best.hdf5
Epoch 62/200
94379/94379 [==============================] - 117s 1ms/step - loss: 0.8635 - binary_accuracy: 0.8954 - val_loss: 1.1469 - val_binary_accuracy: 0.7147

Epoch 00062: val_binary_accuracy did not improve from 0.75135
Epoch 63/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.8561 - binary_accuracy: 0.8982 - val_loss: 1.0643 - val_binary_accuracy: 0.7456

Epoch 00063: val_binary_accuracy did not improve from 0.75135
Epoch 64/200
94379/94379 [==============================] - 118s 1ms/step - loss: 0.8485 - binary_accuracy: 0.9021 - val_loss: 1.0687 - val_binary_accuracy: 0.7285

Epoch 00064: val_binary_accuracy did not improve from 0.75135
Epoch 65/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.8431 - binary_accuracy: 0.9038 - val_loss: 1.0565 - val_binary_accuracy: 0.7471

Epoch 00065: val_binary_accuracy did not improve from 0.75135
Epoch 66/200
94379/94379 [==============================] - 117s 1ms/step - loss: 0.8367 - binary_accuracy: 0.9078 - val_loss: 1.0615 - val_binary_accuracy: 0.7327

Epoch 00066: val_binary_accuracy did not improve from 0.75135
Epoch 67/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.8265 - binary_accuracy: 0.9141 - val_loss: 1.1171 - val_binary_accuracy: 0.6826

Epoch 00067: val_binary_accuracy did not improve from 0.75135
Epoch 68/200
94379/94379 [==============================] - 117s 1ms/step - loss: 0.8196 - binary_accuracy: 0.9177 - val_loss: 1.0931 - val_binary_accuracy: 0.7016

Epoch 00068: val_binary_accuracy did not improve from 0.75135
Epoch 69/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.8132 - binary_accuracy: 0.9191 - val_loss: 1.0983 - val_binary_accuracy: 0.6968

Epoch 00069: val_binary_accuracy did not improve from 0.75135
Epoch 70/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.8081 - binary_accuracy: 0.9213 - val_loss: 1.0505 - val_binary_accuracy: 0.7399

Epoch 00070: val_binary_accuracy did not improve from 0.75135
Epoch 71/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.7999 - binary_accuracy: 0.9260 - val_loss: 1.0672 - val_binary_accuracy: 0.7626

Epoch 00071: val_binary_accuracy improved from 0.75135 to 0.76262, saving model to weights.best.hdf5
Epoch 72/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.7947 - binary_accuracy: 0.9275 - val_loss: 1.0431 - val_binary_accuracy: 0.7462

Epoch 00072: val_binary_accuracy did not improve from 0.76262
Epoch 73/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.7885 - binary_accuracy: 0.9316 - val_loss: 1.0353 - val_binary_accuracy: 0.7592

Epoch 00073: val_binary_accuracy did not improve from 0.76262
Epoch 74/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.7840 - binary_accuracy: 0.9321 - val_loss: 1.0324 - val_binary_accuracy: 0.7570

Epoch 00074: val_binary_accuracy did not improve from 0.76262
Epoch 75/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.7749 - binary_accuracy: 0.9377 - val_loss: 1.0326 - val_binary_accuracy: 0.7698

Epoch 00075: val_binary_accuracy improved from 0.76262 to 0.76982, saving model to weights.best.hdf5
Epoch 76/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.7684 - binary_accuracy: 0.9413 - val_loss: 1.0988 - val_binary_accuracy: 0.6978

Epoch 00076: val_binary_accuracy did not improve from 0.76982
Epoch 77/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.7618 - binary_accuracy: 0.9444 - val_loss: 1.0248 - val_binary_accuracy: 0.7761

Epoch 00077: val_binary_accuracy improved from 0.76982 to 0.77605, saving model to weights.best.hdf5
Epoch 78/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.7572 - binary_accuracy: 0.9454 - val_loss: 1.1055 - val_binary_accuracy: 0.6997

Epoch 00078: val_binary_accuracy did not improve from 0.77605
Epoch 79/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.7513 - binary_accuracy: 0.9477 - val_loss: 1.0335 - val_binary_accuracy: 0.7781

Epoch 00079: val_binary_accuracy improved from 0.77605 to 0.77813, saving model to weights.best.hdf5
Epoch 80/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.7450 - binary_accuracy: 0.9507 - val_loss: 1.0189 - val_binary_accuracy: 0.7720

Epoch 00080: val_binary_accuracy did not improve from 0.77813
Epoch 81/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.7383 - binary_accuracy: 0.9536 - val_loss: 1.0171 - val_binary_accuracy: 0.7775

Epoch 00081: val_binary_accuracy did not improve from 0.77813
Epoch 82/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.7371 - binary_accuracy: 0.9530 - val_loss: 1.1029 - val_binary_accuracy: 0.6969

Epoch 00082: val_binary_accuracy did not improve from 0.77813
Epoch 83/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.7284 - binary_accuracy: 0.9570 - val_loss: 1.0072 - val_binary_accuracy: 0.7828

Epoch 00083: val_binary_accuracy improved from 0.77813 to 0.78279, saving model to weights.best.hdf5
Epoch 84/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.7236 - binary_accuracy: 0.9594 - val_loss: 1.0403 - val_binary_accuracy: 0.7817

Epoch 00084: val_binary_accuracy did not improve from 0.78279
Epoch 85/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.7186 - binary_accuracy: 0.9610 - val_loss: 1.0176 - val_binary_accuracy: 0.7875

Epoch 00085: val_binary_accuracy improved from 0.78279 to 0.78754, saving model to weights.best.hdf5
Epoch 86/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.7119 - binary_accuracy: 0.9650 - val_loss: 1.0114 - val_binary_accuracy: 0.7653

Epoch 00086: val_binary_accuracy did not improve from 0.78754
Epoch 87/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.7047 - binary_accuracy: 0.9690 - val_loss: 1.0119 - val_binary_accuracy: 0.7884

Epoch 00087: val_binary_accuracy improved from 0.78754 to 0.78843, saving model to weights.best.hdf5
Epoch 88/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.7035 - binary_accuracy: 0.9678 - val_loss: 1.0128 - val_binary_accuracy: 0.7659

Epoch 00088: val_binary_accuracy did not improve from 0.78843
Epoch 89/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.6993 - binary_accuracy: 0.9692 - val_loss: 1.0112 - val_binary_accuracy: 0.7901

Epoch 00089: val_binary_accuracy improved from 0.78843 to 0.79008, saving model to weights.best.hdf5
Epoch 90/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.6931 - binary_accuracy: 0.9718 - val_loss: 1.0089 - val_binary_accuracy: 0.7714

Epoch 00090: val_binary_accuracy did not improve from 0.79008
Epoch 91/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.6883 - binary_accuracy: 0.9731 - val_loss: 1.0268 - val_binary_accuracy: 0.7486

Epoch 00091: val_binary_accuracy did not improve from 0.79008
Epoch 92/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.6828 - binary_accuracy: 0.9749 - val_loss: 1.0579 - val_binary_accuracy: 0.7292

Epoch 00092: val_binary_accuracy did not improve from 0.79008
Epoch 93/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.6813 - binary_accuracy: 0.9743 - val_loss: 1.0136 - val_binary_accuracy: 0.7630

Epoch 00093: val_binary_accuracy did not improve from 0.79008
Epoch 94/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.6764 - binary_accuracy: 0.9767 - val_loss: 1.0435 - val_binary_accuracy: 0.7418

Epoch 00094: val_binary_accuracy did not improve from 0.79008
Epoch 95/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.6711 - binary_accuracy: 0.9783 - val_loss: 0.9992 - val_binary_accuracy: 0.7797

Epoch 00095: val_binary_accuracy did not improve from 0.79008
Epoch 96/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.6663 - binary_accuracy: 0.9803 - val_loss: 1.0006 - val_binary_accuracy: 0.7761

Epoch 00096: val_binary_accuracy did not improve from 0.79008
Epoch 97/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.6629 - binary_accuracy: 0.9814 - val_loss: 1.0220 - val_binary_accuracy: 0.7534

Epoch 00097: val_binary_accuracy did not improve from 0.79008
Epoch 98/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.6589 - binary_accuracy: 0.9825 - val_loss: 0.9896 - val_binary_accuracy: 0.7997

Epoch 00098: val_binary_accuracy improved from 0.79008 to 0.79966, saving model to weights.best.hdf5
Epoch 99/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.6535 - binary_accuracy: 0.9848 - val_loss: 1.0100 - val_binary_accuracy: 0.7630

Epoch 00099: val_binary_accuracy did not improve from 0.79966
Epoch 100/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.6502 - binary_accuracy: 0.9853 - val_loss: 1.0008 - val_binary_accuracy: 0.8023

Epoch 00100: val_binary_accuracy improved from 0.79966 to 0.80233, saving model to weights.best.hdf5
Epoch 101/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.6468 - binary_accuracy: 0.9863 - val_loss: 1.0352 - val_binary_accuracy: 0.7443

Epoch 00101: val_binary_accuracy did not improve from 0.80233
Epoch 102/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.6455 - binary_accuracy: 0.9857 - val_loss: 1.0505 - val_binary_accuracy: 0.7341

Epoch 00102: val_binary_accuracy did not improve from 0.80233
Epoch 103/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.6418 - binary_accuracy: 0.9870 - val_loss: 1.0382 - val_binary_accuracy: 0.7423

Epoch 00103: val_binary_accuracy did not improve from 0.80233
Epoch 104/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.6342 - binary_accuracy: 0.9901 - val_loss: 1.0129 - val_binary_accuracy: 0.7584

Epoch 00104: val_binary_accuracy did not improve from 0.80233
Epoch 105/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.6329 - binary_accuracy: 0.9900 - val_loss: 1.0402 - val_binary_accuracy: 0.7402

Epoch 00105: val_binary_accuracy did not improve from 0.80233
Epoch 106/200
94379/94379 [==============================] - 111s 1ms/step - loss: 0.6279 - binary_accuracy: 0.9914 - val_loss: 0.9977 - val_binary_accuracy: 0.7734

Epoch 00106: val_binary_accuracy did not improve from 0.80233
Epoch 107/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.6263 - binary_accuracy: 0.9911 - val_loss: 0.9847 - val_binary_accuracy: 0.7816

Epoch 00107: val_binary_accuracy did not improve from 0.80233
Epoch 108/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.6213 - binary_accuracy: 0.9928 - val_loss: 0.9866 - val_binary_accuracy: 0.8095

Epoch 00108: val_binary_accuracy improved from 0.80233 to 0.80945, saving model to weights.best.hdf5
Epoch 109/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.6177 - binary_accuracy: 0.9938 - val_loss: 1.0084 - val_binary_accuracy: 0.8059

Epoch 00109: val_binary_accuracy did not improve from 0.80945
Epoch 110/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.6170 - binary_accuracy: 0.9932 - val_loss: 0.9815 - val_binary_accuracy: 0.7904

Epoch 00110: val_binary_accuracy did not improve from 0.80945
Epoch 111/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.6127 - binary_accuracy: 0.9945 - val_loss: 0.9908 - val_binary_accuracy: 0.7768

Epoch 00111: val_binary_accuracy did not improve from 0.80945
Epoch 112/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.6099 - binary_accuracy: 0.9948 - val_loss: 0.9735 - val_binary_accuracy: 0.7961

Epoch 00112: val_binary_accuracy did not improve from 0.80945
Epoch 113/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.6092 - binary_accuracy: 0.9938 - val_loss: 1.0487 - val_binary_accuracy: 0.7355

Epoch 00113: val_binary_accuracy did not improve from 0.80945
Epoch 114/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.6047 - binary_accuracy: 0.9953 - val_loss: 0.9813 - val_binary_accuracy: 0.8104

Epoch 00114: val_binary_accuracy improved from 0.80945 to 0.81038, saving model to weights.best.hdf5
Epoch 115/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.6009 - binary_accuracy: 0.9960 - val_loss: 0.9782 - val_binary_accuracy: 0.7925

Epoch 00115: val_binary_accuracy did not improve from 0.81038
Epoch 116/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5991 - binary_accuracy: 0.9964 - val_loss: 0.9763 - val_binary_accuracy: 0.8074

Epoch 00116: val_binary_accuracy did not improve from 0.81038
Epoch 117/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.5946 - binary_accuracy: 0.9977 - val_loss: 0.9723 - val_binary_accuracy: 0.8011

Epoch 00117: val_binary_accuracy did not improve from 0.81038
Epoch 118/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5936 - binary_accuracy: 0.9975 - val_loss: 0.9915 - val_binary_accuracy: 0.7722

Epoch 00118: val_binary_accuracy did not improve from 0.81038
Epoch 119/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.5902 - binary_accuracy: 0.9979 - val_loss: 0.9677 - val_binary_accuracy: 0.7992

Epoch 00119: val_binary_accuracy did not improve from 0.81038
Epoch 120/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.5886 - binary_accuracy: 0.9978 - val_loss: 0.9854 - val_binary_accuracy: 0.7769

Epoch 00120: val_binary_accuracy did not improve from 0.81038
Epoch 121/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5858 - binary_accuracy: 0.9982 - val_loss: 1.0627 - val_binary_accuracy: 0.7316

Epoch 00121: val_binary_accuracy did not improve from 0.81038
Epoch 122/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5838 - binary_accuracy: 0.9980 - val_loss: 0.9658 - val_binary_accuracy: 0.8003

Epoch 00122: val_binary_accuracy did not improve from 0.81038
Epoch 123/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5828 - binary_accuracy: 0.9977 - val_loss: 0.9713 - val_binary_accuracy: 0.8051

Epoch 00123: val_binary_accuracy did not improve from 0.81038
Epoch 124/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.5791 - binary_accuracy: 0.9987 - val_loss: 0.9640 - val_binary_accuracy: 0.8010

Epoch 00124: val_binary_accuracy did not improve from 0.81038
Epoch 125/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.5765 - binary_accuracy: 0.9991 - val_loss: 0.9898 - val_binary_accuracy: 0.7722

Epoch 00125: val_binary_accuracy did not improve from 0.81038
Epoch 126/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.5756 - binary_accuracy: 0.9987 - val_loss: 0.9674 - val_binary_accuracy: 0.8054

Epoch 00126: val_binary_accuracy did not improve from 0.81038
Epoch 127/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5729 - binary_accuracy: 0.9989 - val_loss: 0.9653 - val_binary_accuracy: 0.7969

Epoch 00127: val_binary_accuracy did not improve from 0.81038
Epoch 128/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.5715 - binary_accuracy: 0.9989 - val_loss: 0.9777 - val_binary_accuracy: 0.8164

Epoch 00128: val_binary_accuracy improved from 0.81038 to 0.81636, saving model to weights.best.hdf5
Epoch 129/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5682 - binary_accuracy: 0.9994 - val_loss: 0.9839 - val_binary_accuracy: 0.7745

Epoch 00129: val_binary_accuracy did not improve from 0.81636
Epoch 130/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.5666 - binary_accuracy: 0.9993 - val_loss: 1.0087 - val_binary_accuracy: 0.7618

Epoch 00130: val_binary_accuracy did not improve from 0.81636
Epoch 131/200
94379/94379 [==============================] - 121s 1ms/step - loss: 0.5650 - binary_accuracy: 0.9995 - val_loss: 0.9635 - val_binary_accuracy: 0.8100

Epoch 00131: val_binary_accuracy did not improve from 0.81636
Epoch 132/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5640 - binary_accuracy: 0.9994 - val_loss: 0.9591 - val_binary_accuracy: 0.8060

Epoch 00132: val_binary_accuracy did not improve from 0.81636
Epoch 133/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.5610 - binary_accuracy: 0.9996 - val_loss: 1.0277 - val_binary_accuracy: 0.7477

Epoch 00133: val_binary_accuracy did not improve from 0.81636
Epoch 134/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.5594 - binary_accuracy: 0.9996 - val_loss: 0.9579 - val_binary_accuracy: 0.8046

Epoch 00134: val_binary_accuracy did not improve from 0.81636
Epoch 135/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.5590 - binary_accuracy: 0.9994 - val_loss: 0.9844 - val_binary_accuracy: 0.7750

Epoch 00135: val_binary_accuracy did not improve from 0.81636
Epoch 136/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.5560 - binary_accuracy: 0.9996 - val_loss: 0.9724 - val_binary_accuracy: 0.8169

Epoch 00136: val_binary_accuracy improved from 0.81636 to 0.81691, saving model to weights.best.hdf5
Epoch 137/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.5549 - binary_accuracy: 0.9997 - val_loss: 0.9612 - val_binary_accuracy: 0.7946

Epoch 00137: val_binary_accuracy did not improve from 0.81691
Epoch 138/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5525 - binary_accuracy: 0.9998 - val_loss: 0.9602 - val_binary_accuracy: 0.8035

Epoch 00138: val_binary_accuracy did not improve from 0.81691
Epoch 139/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.5508 - binary_accuracy: 0.9998 - val_loss: 0.9718 - val_binary_accuracy: 0.7814

Epoch 00139: val_binary_accuracy did not improve from 0.81691
Epoch 140/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5499 - binary_accuracy: 0.9997 - val_loss: 0.9644 - val_binary_accuracy: 0.8116

Epoch 00140: val_binary_accuracy did not improve from 0.81691
Epoch 141/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5486 - binary_accuracy: 0.9997 - val_loss: 0.9616 - val_binary_accuracy: 0.7905

Epoch 00141: val_binary_accuracy did not improve from 0.81691
Epoch 142/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5463 - binary_accuracy: 0.9999 - val_loss: 0.9686 - val_binary_accuracy: 0.7867

Epoch 00142: val_binary_accuracy did not improve from 0.81691
Epoch 143/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.5445 - binary_accuracy: 0.9999 - val_loss: 0.9974 - val_binary_accuracy: 0.7661

Epoch 00143: val_binary_accuracy did not improve from 0.81691
Epoch 144/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.5435 - binary_accuracy: 0.9998 - val_loss: 0.9538 - val_binary_accuracy: 0.8050

Epoch 00144: val_binary_accuracy did not improve from 0.81691
Epoch 145/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.5418 - binary_accuracy: 0.9999 - val_loss: 0.9642 - val_binary_accuracy: 0.7870

Epoch 00145: val_binary_accuracy did not improve from 0.81691
Epoch 146/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5406 - binary_accuracy: 0.9999 - val_loss: 0.9716 - val_binary_accuracy: 0.7816

Epoch 00146: val_binary_accuracy did not improve from 0.81691
Epoch 147/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.5393 - binary_accuracy: 0.9999 - val_loss: 0.9580 - val_binary_accuracy: 0.7936

Epoch 00147: val_binary_accuracy did not improve from 0.81691
Epoch 148/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.5368 - binary_accuracy: 1.0000 - val_loss: 0.9560 - val_binary_accuracy: 0.7934

Epoch 00148: val_binary_accuracy did not improve from 0.81691
Epoch 149/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5358 - binary_accuracy: 1.0000 - val_loss: 0.9582 - val_binary_accuracy: 0.7942

Epoch 00149: val_binary_accuracy did not improve from 0.81691
Epoch 150/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.5343 - binary_accuracy: 0.9999 - val_loss: 0.9554 - val_binary_accuracy: 0.7959

Epoch 00150: val_binary_accuracy did not improve from 0.81691
Epoch 151/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5338 - binary_accuracy: 1.0000 - val_loss: 0.9534 - val_binary_accuracy: 0.7971

Epoch 00151: val_binary_accuracy did not improve from 0.81691
Epoch 152/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.5320 - binary_accuracy: 1.0000 - val_loss: 0.9530 - val_binary_accuracy: 0.8122

Epoch 00152: val_binary_accuracy did not improve from 0.81691
Epoch 153/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.5302 - binary_accuracy: 0.9999 - val_loss: 0.9511 - val_binary_accuracy: 0.8089

Epoch 00153: val_binary_accuracy did not improve from 0.81691
Epoch 154/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.5289 - binary_accuracy: 1.0000 - val_loss: 0.9633 - val_binary_accuracy: 0.7866

Epoch 00154: val_binary_accuracy did not improve from 0.81691
Epoch 155/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5278 - binary_accuracy: 1.0000 - val_loss: 0.9712 - val_binary_accuracy: 0.7803

Epoch 00155: val_binary_accuracy did not improve from 0.81691
Epoch 156/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.5268 - binary_accuracy: 1.0000 - val_loss: 0.9875 - val_binary_accuracy: 0.7682

Epoch 00156: val_binary_accuracy did not improve from 0.81691
Epoch 157/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.5252 - binary_accuracy: 1.0000 - val_loss: 1.0307 - val_binary_accuracy: 0.7470

Epoch 00157: val_binary_accuracy did not improve from 0.81691
Epoch 158/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5241 - binary_accuracy: 0.9999 - val_loss: 0.9608 - val_binary_accuracy: 0.7889

Epoch 00158: val_binary_accuracy did not improve from 0.81691
Epoch 159/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5224 - binary_accuracy: 1.0000 - val_loss: 0.9470 - val_binary_accuracy: 0.8114

Epoch 00159: val_binary_accuracy did not improve from 0.81691
Epoch 160/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.5214 - binary_accuracy: 1.0000 - val_loss: 0.9501 - val_binary_accuracy: 0.8137

Epoch 00160: val_binary_accuracy did not improve from 0.81691
Epoch 161/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5201 - binary_accuracy: 1.0000 - val_loss: 0.9488 - val_binary_accuracy: 0.8017

Epoch 00161: val_binary_accuracy did not improve from 0.81691
Epoch 162/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5188 - binary_accuracy: 1.0000 - val_loss: 0.9493 - val_binary_accuracy: 0.8002

Epoch 00162: val_binary_accuracy did not improve from 0.81691
Epoch 163/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.5177 - binary_accuracy: 1.0000 - val_loss: 0.9432 - val_binary_accuracy: 0.8066

Epoch 00163: val_binary_accuracy did not improve from 0.81691
Epoch 164/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.5165 - binary_accuracy: 1.0000 - val_loss: 0.9544 - val_binary_accuracy: 0.7917

Epoch 00164: val_binary_accuracy did not improve from 0.81691
Epoch 165/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.5155 - binary_accuracy: 1.0000 - val_loss: 1.0495 - val_binary_accuracy: 0.7393

Epoch 00165: val_binary_accuracy did not improve from 0.81691
Epoch 166/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.5150 - binary_accuracy: 1.0000 - val_loss: 0.9549 - val_binary_accuracy: 0.8195

Epoch 00166: val_binary_accuracy improved from 0.81691 to 0.81945, saving model to weights.best.hdf5
Epoch 167/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.5131 - binary_accuracy: 1.0000 - val_loss: 0.9602 - val_binary_accuracy: 0.7855

Epoch 00167: val_binary_accuracy did not improve from 0.81945
Epoch 168/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.5120 - binary_accuracy: 1.0000 - val_loss: 0.9494 - val_binary_accuracy: 0.7952

Epoch 00168: val_binary_accuracy did not improve from 0.81945
Epoch 169/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.5106 - binary_accuracy: 1.0000 - val_loss: 0.9722 - val_binary_accuracy: 0.7773

Epoch 00169: val_binary_accuracy did not improve from 0.81945
Epoch 170/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.5099 - binary_accuracy: 1.0000 - val_loss: 0.9690 - val_binary_accuracy: 0.7788

Epoch 00170: val_binary_accuracy did not improve from 0.81945
Epoch 171/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.5087 - binary_accuracy: 1.0000 - val_loss: 0.9475 - val_binary_accuracy: 0.7969

Epoch 00171: val_binary_accuracy did not improve from 0.81945
Epoch 172/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.5072 - binary_accuracy: 1.0000 - val_loss: 0.9462 - val_binary_accuracy: 0.8150

Epoch 00172: val_binary_accuracy did not improve from 0.81945
Epoch 173/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5065 - binary_accuracy: 1.0000 - val_loss: 0.9404 - val_binary_accuracy: 0.8075

Epoch 00173: val_binary_accuracy did not improve from 0.81945
Epoch 174/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.5054 - binary_accuracy: 1.0000 - val_loss: 0.9424 - val_binary_accuracy: 0.8003

Epoch 00174: val_binary_accuracy did not improve from 0.81945
Epoch 175/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.5042 - binary_accuracy: 1.0000 - val_loss: 0.9429 - val_binary_accuracy: 0.8157

Epoch 00175: val_binary_accuracy did not improve from 0.81945
Epoch 176/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.5034 - binary_accuracy: 1.0000 - val_loss: 0.9420 - val_binary_accuracy: 0.8038

Epoch 00176: val_binary_accuracy did not improve from 0.81945
Epoch 177/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5022 - binary_accuracy: 1.0000 - val_loss: 0.9407 - val_binary_accuracy: 0.8126

Epoch 00177: val_binary_accuracy did not improve from 0.81945
Epoch 178/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.5011 - binary_accuracy: 1.0000 - val_loss: 0.9621 - val_binary_accuracy: 0.7830

Epoch 00178: val_binary_accuracy did not improve from 0.81945
Epoch 179/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.5005 - binary_accuracy: 1.0000 - val_loss: 0.9423 - val_binary_accuracy: 0.8180

Epoch 00179: val_binary_accuracy did not improve from 0.81945
Epoch 180/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.4991 - binary_accuracy: 1.0000 - val_loss: 0.9471 - val_binary_accuracy: 0.7932

Epoch 00180: val_binary_accuracy did not improve from 0.81945
Epoch 181/200
94379/94379 [==============================] - 116s 1ms/step - loss: 0.4981 - binary_accuracy: 1.0000 - val_loss: 0.9465 - val_binary_accuracy: 0.7922

Epoch 00181: val_binary_accuracy did not improve from 0.81945
Epoch 182/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.4970 - binary_accuracy: 1.0000 - val_loss: 0.9495 - val_binary_accuracy: 0.7875

Epoch 00182: val_binary_accuracy did not improve from 0.81945
Epoch 183/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.4965 - binary_accuracy: 1.0000 - val_loss: 0.9462 - val_binary_accuracy: 0.7917

Epoch 00183: val_binary_accuracy did not improve from 0.81945
Epoch 184/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.4950 - binary_accuracy: 1.0000 - val_loss: 0.9383 - val_binary_accuracy: 0.8008

Epoch 00184: val_binary_accuracy did not improve from 0.81945
Epoch 185/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.4940 - binary_accuracy: 1.0000 - val_loss: 0.9612 - val_binary_accuracy: 0.7797

Epoch 00185: val_binary_accuracy did not improve from 0.81945
Epoch 186/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.4932 - binary_accuracy: 1.0000 - val_loss: 0.9524 - val_binary_accuracy: 0.7872

Epoch 00186: val_binary_accuracy did not improve from 0.81945
Epoch 187/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.4924 - binary_accuracy: 1.0000 - val_loss: 0.9352 - val_binary_accuracy: 0.8118

Epoch 00187: val_binary_accuracy did not improve from 0.81945
Epoch 188/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.4913 - binary_accuracy: 1.0000 - val_loss: 0.9466 - val_binary_accuracy: 0.8195

Epoch 00188: val_binary_accuracy improved from 0.81945 to 0.81950, saving model to weights.best.hdf5
Epoch 189/200
94379/94379 [==============================] - 115s 1ms/step - loss: 0.4903 - binary_accuracy: 1.0000 - val_loss: 0.9398 - val_binary_accuracy: 0.7951

Epoch 00189: val_binary_accuracy did not improve from 0.81950
Epoch 190/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.4893 - binary_accuracy: 1.0000 - val_loss: 0.9556 - val_binary_accuracy: 0.7834

Epoch 00190: val_binary_accuracy did not improve from 0.81950
Epoch 191/200
94379/94379 [==============================] - 113s 1ms/step - loss: 0.4882 - binary_accuracy: 1.0000 - val_loss: 0.9376 - val_binary_accuracy: 0.7977

Epoch 00191: val_binary_accuracy did not improve from 0.81950
Epoch 192/200
94379/94379 [==============================] - 112s 1ms/step - loss: 0.4875 - binary_accuracy: 1.0000 - val_loss: 0.9498 - val_binary_accuracy: 0.7874

Epoch 00192: val_binary_accuracy did not improve from 0.81950
Epoch 193/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.4864 - binary_accuracy: 1.0000 - val_loss: 0.9357 - val_binary_accuracy: 0.8108

Epoch 00193: val_binary_accuracy did not improve from 0.81950
Epoch 194/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.4856 - binary_accuracy: 1.0000 - val_loss: 0.9330 - val_binary_accuracy: 0.8022

Epoch 00194: val_binary_accuracy did not improve from 0.81950
Epoch 195/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.4844 - binary_accuracy: 1.0000 - val_loss: 0.9336 - val_binary_accuracy: 0.7998

Epoch 00195: val_binary_accuracy did not improve from 0.81950
Epoch 196/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.4837 - binary_accuracy: 1.0000 - val_loss: 0.9321 - val_binary_accuracy: 0.8051

Epoch 00196: val_binary_accuracy did not improve from 0.81950
Epoch 197/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.4829 - binary_accuracy: 1.0000 - val_loss: 0.9291 - val_binary_accuracy: 0.8058

Epoch 00197: val_binary_accuracy did not improve from 0.81950
Epoch 198/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.4818 - binary_accuracy: 1.0000 - val_loss: 0.9299 - val_binary_accuracy: 0.8055

Epoch 00198: val_binary_accuracy did not improve from 0.81950
Epoch 199/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.4809 - binary_accuracy: 1.0000 - val_loss: 0.9324 - val_binary_accuracy: 0.7996

Epoch 00199: val_binary_accuracy did not improve from 0.81950
Epoch 200/200
94379/94379 [==============================] - 114s 1ms/step - loss: 0.4799 - binary_accuracy: 1.0000 - val_loss: 0.9304 - val_binary_accuracy: 0.8030

Epoch 00200: val_binary_accuracy did not improve from 0.81950
In [47]:
import matplotlib.pyplot as plt

plt.figure(figsize=(20,15))
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('Model Loss', fontsize = 20)
plt.ylabel('Loss', fontsize = 20)
plt.xlabel('Epoch', fontsize = 20)
plt.legend(['Train', 'Validation'], fontsize = 20)
plt.show()

plt.figure(figsize=(20,15))
plt.plot(history.history['binary_accuracy'])
plt.plot(history.history['val_binary_accuracy'])
plt.title('Model Accuracy', fontsize = 20)
plt.ylabel('Accuracy', fontsize = 20)
plt.xlabel('Epoch', fontsize = 20)
plt.legend(['Train', 'Validation'], fontsize = 20)
plt.show()
In [48]:
model.load_weights("weights.best.hdf5")
model.compile(loss = 'binary_crossentropy', optimizer = sgd, metrics = ['binary_accuracy'])
In [49]:
from sklearn.metrics import confusion_matrix
import itertools

plt.figure(figsize=(15,10))

predicted_labels = model.predict(X_test)
cm = confusion_matrix(y_test, [np.round(i[0]) for i in predicted_labels])
print('Confusion matrix:\n',cm)

cm = cm.astype('float') / cm.sum(axis = 1)[:, np.newaxis]

plt.imshow(cm, cmap = plt.cm.Blues)
plt.title('Normalized confusion matrix', fontsize = 20)
plt.colorbar()
plt.xlabel('Predicted label', fontsize = 20)
plt.ylabel('True label', fontsize = 20)
for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):
    plt.text(j, i, format(cm[i, j], '.2f'),
             horizontalalignment = 'center', verticalalignment = 'center', fontsize = 20,
             color='white' if cm[i, j] > 0.5 else 'black')
plt.show()
Confusion matrix:
 [[13232  1681]
 [ 3580 11001]]
In [50]:
scores = model.evaluate(X_test, y_test, verbose = 0)
print("Accuracy: %.2f%%" % (scores[1]*100))
Accuracy: 82.16%

We reached quite a good classification accuracy, although still worse than the Random Forest model on the same Bag of Words (K-mer frequencies) data set. Let us see whether more advanced Word Embedding models can improve the accuracy of classifying Neanderthal introgressed vs. depleted sequences / sentences / texts.
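As a reminder of what the Bag of Words representation used above looks like, here is a minimal pure-Python sketch (with a hypothetical toy sequence) that counts overlapping K-mer frequencies in a DNA string:

```python
from collections import Counter

def kmer_frequencies(sequence, size=5):
    """Count overlapping K-mers and normalize counts to frequencies (Bag of Words)."""
    kmers = [sequence[i:i + size].upper() for i in range(len(sequence) - size + 1)]
    counts = Counter(kmers)
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

# Toy example: a 10-bp sequence yields 10 - 5 + 1 = 6 overlapping 5-mers
freqs = kmer_frequencies('ACGTACGTAC', size=5)
print(freqs)  # ACGTA and CGTAC each occur twice, GTACG and TACGT once
```

In the Bag of Words model only these frequencies are fed to the classifier; the order in which the K-mers occur is discarded.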

Multilayer Perceptron + Embedding Layer for Neanderthal Introgressed vs. Depleted Sequence Classification

Here, instead of the Bag of Words model, we will implement a Multilayer Perceptron model with an Embedding Layer, which is supposed to learn the Word Embeddings, i.e. to find correspondences between words with similar meaning. In this model the order of the words is taken into account when classifying sequences coming from Neanderthal introgressed vs. depleted regions. We again start by reading the sequences from the two fasta-files (introgressed and depleted regions), splitting them into words / K-mers and building sentences out of them.

In [1]:
import os
from Bio import SeqIO
from Bio.Seq import Seq

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

intr_file = 'hg19_intr_clean.fa'
depl_file = 'hg19_depl_clean.fa'

e = 0
intr_seqs = []
depl_seqs = []
for intr, depl in zip(SeqIO.parse(intr_file, 'fasta'), SeqIO.parse(depl_file, 'fasta')):
    
    cutoff = 4000
    my_intr_seq = str(intr.seq)[0:cutoff]
    my_depl_seq = str(depl.seq)[0:cutoff]
    
    intr_seqs.append(my_intr_seq)
    depl_seqs.append(my_depl_seq)

    e = e + 1
    if e%20000 == 0:
        print('Finished ' + str(e) + ' entries')
        
def getKmers(sequence, size):
    return [sequence[x:x+size].upper() for x in range(len(sequence) - size + 1)]

kmer = 5
intr_texts = [' '.join(getKmers(i, kmer)) for i in intr_seqs]
depl_texts = [' '.join(getKmers(i, kmer)) for i in depl_seqs]
Finished 20000 entries
Finished 40000 entries
Finished 60000 entries

Here we will use the Tokenizer class from Keras in order to convert words / K-mers into integers.

In [2]:
merge_texts = intr_texts + depl_texts
len(merge_texts)
Out[2]:
147468
In [3]:
import numpy as np
labels = list(np.ones(len(intr_texts))) + list(np.zeros(len(depl_texts)))
print(len(labels))
147468
In [4]:
from keras.preprocessing.text import Tokenizer
from keras.preprocessing.sequence import pad_sequences
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_extraction.text import TfidfTransformer

import warnings
warnings.filterwarnings('ignore')

#cv = CountVectorizer()
#X = cv.fit_transform(merge_texts)

#tfidf_transformer = TfidfTransformer()
#X = tfidf_transformer.fit_transform(X)

tokenizer = Tokenizer()
tokenizer.fit_on_texts(merge_texts)

#X = tokenizer.texts_to_matrix(merge_texts, mode = 'freq')

encoded_docs = tokenizer.texts_to_sequences(merge_texts)
max_length = max([len(s.split()) for s in merge_texts])
X = pad_sequences(encoded_docs, maxlen = max_length, padding = 'post')

#X = X.toarray()

print(X)
print('\n')
print(X.shape)
Using TensorFlow backend.
[[ 76 633 356 ... 320 548 845]
 [578 167 569 ... 535 613 144]
 [578 167 569 ... 535 613 144]
 ...
 [346 188 516 ... 671  61  25]
 [856 987 997 ... 633 435 648]
 [ 48 409 456 ...  29  55  39]]


(147468, 3996)

Again, we split the data set into training, X_train, and testing, X_test, subsets:

In [5]:
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size = 0.20, random_state = 42)
In [6]:
print(X_train.shape)
print(y_train[0:10])
print(X_test.shape)
print(y_test[0:10])
(117974, 3996)
[0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0]
(29494, 3996)
[1.0, 0.0, 1.0, 1.0, 1.0, 0.0, 1.0, 1.0, 1.0, 0.0]
In [7]:
print(max_length)
3996
In [8]:
vocab_size = len(tokenizer.word_index) + 1
print(vocab_size)
1025

Now the data is ready to be fed into the Multilayer Perceptron model for classifying sequences coming from Neanderthal introgressed vs. depleted regions.

In [11]:
from keras.models import Sequential
from keras.regularizers import l2, l1
from keras.callbacks import ModelCheckpoint
from keras.optimizers import SGD, Adam, Adadelta
from keras.layers import Conv1D, Dense, MaxPooling1D, Flatten, Dropout, Embedding, GlobalAveragePooling1D

model = Sequential()
model.add(Embedding(vocab_size, 100, input_length = max_length)) #W_regularizer = l1(0.000001)
#model.add(Conv1D(filters = 16, kernel_size = 5, padding = 'same', activation = 'relu'))
#model.add(MaxPooling1D(pool_size = 2))
#model.add(GlobalAveragePooling1D())
model.add(Flatten())
model.add(Dense(10, activation = 'sigmoid')) #kernel_regularizer = l1(0.00004)
#model.add(Dropout(0.5))
#model.add(Dense(10, activation = 'sigmoid'))
#model.add(Dropout(0.5))
model.add(Dense(1, activation = 'sigmoid'))

epochs = 20
#lrate = 0.0001
#decay = lrate / epochs
#sgd = SGD(lr = lrate, momentum = 0.9, nesterov = False)
#sgd = SGD(lr = lrate, momentum = 0.9, decay = decay, nesterov = False)
#model.compile(loss = 'binary_crossentropy', optimizer = Adam(lr = lrate), metrics = ['accuracy'])
#model.compile(loss = 'binary_crossentropy', optimizer = 'adam', metrics = ['accuracy'])
#model.compile(loss = 'binary_crossentropy', optimizer = sgd, metrics = ['accuracy'])
model.compile(loss = 'binary_crossentropy', optimizer = 'SGD', metrics = ['accuracy'])
checkpoint = ModelCheckpoint("weights.best.hdf5", monitor = 'val_acc', verbose = 1, 
                             save_best_only = True, mode = 'max')
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding_2 (Embedding)      (None, 3996, 100)         102500    
_________________________________________________________________
flatten_2 (Flatten)          (None, 399600)            0         
_________________________________________________________________
dense_3 (Dense)              (None, 10)                3996010   
_________________________________________________________________
dense_4 (Dense)              (None, 1)                 11        
=================================================================
Total params: 4,098,521
Trainable params: 4,098,521
Non-trainable params: 0
_________________________________________________________________
In [12]:
import warnings
warnings.filterwarnings('ignore')

history = model.fit(X_train, y_train, 
                    epochs = epochs, verbose = 1, validation_split = 0.2, batch_size = 32, shuffle = True, 
                    callbacks = [checkpoint])
Train on 94379 samples, validate on 23595 samples
Epoch 1/20
94379/94379 [==============================] - 288s 3ms/step - loss: 0.6920 - acc: 0.5306 - val_loss: 0.6900 - val_acc: 0.5848

Epoch 00001: val_acc improved from -inf to 0.58478, saving model to weights.best.hdf5
Epoch 2/20
94379/94379 [==============================] - 301s 3ms/step - loss: 0.6845 - acc: 0.6789 - val_loss: 0.6860 - val_acc: 0.6557

Epoch 00002: val_acc improved from 0.58478 to 0.65569, saving model to weights.best.hdf5
Epoch 3/20
94379/94379 [==============================] - 298s 3ms/step - loss: 0.6746 - acc: 0.7963 - val_loss: 0.6797 - val_acc: 0.6869

Epoch 00003: val_acc improved from 0.65569 to 0.68688, saving model to weights.best.hdf5
Epoch 4/20
94379/94379 [==============================] - 303s 3ms/step - loss: 0.6586 - acc: 0.8511 - val_loss: 0.6695 - val_acc: 0.6896

Epoch 00004: val_acc improved from 0.68688 to 0.68960, saving model to weights.best.hdf5
Epoch 5/20
94379/94379 [==============================] - 305s 3ms/step - loss: 0.6323 - acc: 0.8768 - val_loss: 0.6533 - val_acc: 0.7002

Epoch 00005: val_acc improved from 0.68960 to 0.70023, saving model to weights.best.hdf5
Epoch 6/20
94379/94379 [==============================] - 955s 10ms/step - loss: 0.5905 - acc: 0.8894 - val_loss: 0.6293 - val_acc: 0.7084

Epoch 00006: val_acc improved from 0.70023 to 0.70837, saving model to weights.best.hdf5
Epoch 7/20
94379/94379 [==============================] - 283s 3ms/step - loss: 0.5298 - acc: 0.9021 - val_loss: 0.5997 - val_acc: 0.7175

Epoch 00007: val_acc improved from 0.70837 to 0.71752, saving model to weights.best.hdf5
Epoch 8/20
94379/94379 [==============================] - 1554s 16ms/step - loss: 0.4522 - acc: 0.9200 - val_loss: 0.5695 - val_acc: 0.7130

Epoch 00008: val_acc did not improve from 0.71752
Epoch 9/20
94379/94379 [==============================] - 279s 3ms/step - loss: 0.3658 - acc: 0.9428 - val_loss: 0.5454 - val_acc: 0.7206

Epoch 00009: val_acc improved from 0.71752 to 0.72062, saving model to weights.best.hdf5
Epoch 10/20
94379/94379 [==============================] - 283s 3ms/step - loss: 0.2813 - acc: 0.9650 - val_loss: 0.5329 - val_acc: 0.7171

Epoch 00010: val_acc did not improve from 0.72062
Epoch 11/20
94379/94379 [==============================] - 293s 3ms/step - loss: 0.2073 - acc: 0.9829 - val_loss: 0.5320 - val_acc: 0.7149

Epoch 00011: val_acc did not improve from 0.72062
Epoch 12/20
94379/94379 [==============================] - 292s 3ms/step - loss: 0.1484 - acc: 0.9947 - val_loss: 0.5397 - val_acc: 0.7125

Epoch 00012: val_acc did not improve from 0.72062
Epoch 13/20
94379/94379 [==============================] - 290s 3ms/step - loss: 0.1052 - acc: 0.9992 - val_loss: 0.5525 - val_acc: 0.7136

Epoch 00013: val_acc did not improve from 0.72062
Epoch 14/20
94379/94379 [==============================] - 288s 3ms/step - loss: 0.0756 - acc: 0.9999 - val_loss: 0.5669 - val_acc: 0.7156

Epoch 00014: val_acc did not improve from 0.72062
Epoch 15/20
94379/94379 [==============================] - 289s 3ms/step - loss: 0.0559 - acc: 1.0000 - val_loss: 0.5836 - val_acc: 0.7150

Epoch 00015: val_acc did not improve from 0.72062
Epoch 16/20
94379/94379 [==============================] - 291s 3ms/step - loss: 0.0428 - acc: 1.0000 - val_loss: 0.6001 - val_acc: 0.7139

Epoch 00016: val_acc did not improve from 0.72062
Epoch 17/20
94379/94379 [==============================] - 294s 3ms/step - loss: 0.0338 - acc: 1.0000 - val_loss: 0.6154 - val_acc: 0.7139

Epoch 00017: val_acc did not improve from 0.72062
Epoch 18/20
94379/94379 [==============================] - 295s 3ms/step - loss: 0.0275 - acc: 1.0000 - val_loss: 0.6303 - val_acc: 0.7129

Epoch 00018: val_acc did not improve from 0.72062
Epoch 19/20
94379/94379 [==============================] - 297s 3ms/step - loss: 0.0229 - acc: 1.0000 - val_loss: 0.6440 - val_acc: 0.7125

Epoch 00019: val_acc did not improve from 0.72062
Epoch 20/20
94379/94379 [==============================] - 297s 3ms/step - loss: 0.0194 - acc: 1.0000 - val_loss: 0.6563 - val_acc: 0.7130

Epoch 00020: val_acc did not improve from 0.72062
In [13]:
model.load_weights("weights.best.hdf5")
model.compile(loss = 'binary_crossentropy', optimizer = 'SGD', metrics = ['accuracy'])
In [15]:
import matplotlib.pyplot as plt

plt.figure(figsize = (20,15))
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('Model Loss', fontsize = 20)
plt.ylabel('Loss', fontsize = 20)
plt.xlabel('Epoch', fontsize = 20)
plt.legend(['Train', 'Validation'], fontsize = 20)
plt.show()

plt.figure(figsize=(20,15))
plt.plot(history.history['acc'])
plt.plot(history.history['val_acc'])
plt.title('Model Accuracy', fontsize = 20)
plt.ylabel('Accuracy', fontsize = 20)
plt.xlabel('Epoch', fontsize = 20)
plt.legend(['Train', 'Validation'], fontsize = 20)
plt.show()

We can see that the model clearly overfits but still reaches quite high accuracy on the validation data set. Some regularization with Dropout or L1 / L2 penalties would be worth testing here; there is potential for improvement.
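In Keras, such regularization could be added by inserting `Dropout` layers between the Dense layers (the commented-out `model.add(Dropout(0.5))` lines above hint at this). Conceptually, during training Dropout zeroes a random fraction of activations and rescales the survivors. A minimal NumPy sketch of inverted dropout (an illustration of the mechanism, not the Keras implementation itself):

```python
import numpy as np

def inverted_dropout(activations, rate, rng):
    """Zero out a fraction `rate` of units and rescale survivors by 1/(1-rate),
    so the expected activation is unchanged between training and inference."""
    keep_prob = 1.0 - rate
    mask = rng.random(activations.shape) < keep_prob  # True for kept units
    return activations * mask / keep_prob

rng = np.random.default_rng(0)
a = np.ones(1000)
dropped = inverted_dropout(a, rate=0.5, rng=rng)
# Survivors are scaled to 2.0, the rest are 0; the mean stays close to 1
print(dropped.mean())
```

Because each unit can disappear on any training step, the network cannot rely on specific co-adapted features, which typically narrows the gap between training and validation accuracy seen above.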

In [16]:
from sklearn.metrics import confusion_matrix
import itertools

plt.figure(figsize=(15,10))

predicted_labels = model.predict(X_test)
cm = confusion_matrix(y_test, [np.round(i[0]) for i in predicted_labels])
print('Confusion matrix:\n',cm)

cm = cm.astype('float') / cm.sum(axis = 1)[:, np.newaxis]

plt.imshow(cm, cmap = plt.cm.Blues)
plt.title('Normalized confusion matrix', fontsize = 20)
plt.colorbar()
plt.xlabel('Predicted label', fontsize = 20)
plt.ylabel('True label', fontsize = 20)
for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):
    plt.text(j, i, format(cm[i, j], '.2f'),
             horizontalalignment = 'center', verticalalignment = 'center', fontsize = 20,
             color='white' if cm[i, j] > 0.5 else 'black')
plt.show()
Confusion matrix:
 [[11816  3097]
 [ 4986  9595]]
In [17]:
scores = model.evaluate(X_test, y_test, verbose=0)
print("Accuracy: %.2f%%" % (scores[1]*100))
Accuracy: 72.59%

We conclude again that the Embedding Layer did not improve the classification accuracy compared to the Bag of Words model. It could not outperform the Random Forest model either.

AutoML Approach with Autokeras

Apparently Autokeras in its current form is extremely memory-hungry: I did not manage to fit even the moderately sized IMDB data set into memory. In theory AutoML is a very appealing concept, but it is currently too immature for real-world applications.

In [1]:
import numpy as np
import tensorflow as tf
import autokeras as ak

index_offset = 3  # word index offset
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.imdb.load_data(num_words = 20,maxlen = 180,
                                                                        index_from = index_offset)
Better speed can be achieved with apex installed from https://www.github.com/nvidia/apex.
In [2]:
x_train[0][0:10]
Out[2]:
[1, 14, 2, 8, 2, 2, 7, 4, 2, 2]
In [3]:
x_test[0][0:10]
Out[3]:
[1, 13, 16, 2, 8, 2, 6, 2, 7, 14]
In [4]:
x_train = x_train
y_train = y_train.reshape(-1, 1)
x_test = x_test
y_test = y_test.reshape(-1, 1)
y_train.shape
Out[4]:
(25000, 1)
In [5]:
word_to_id = tf.keras.datasets.imdb.get_word_index()
word_to_id = {k: (v + index_offset) for k, v in word_to_id.items()}
word_to_id["<PAD>"] = 0
word_to_id["<START>"] = 1
word_to_id["<UNK>"] = 2
In [6]:
id_to_word = {value: key for key, value in word_to_id.items()}
In [7]:
x_train = list(map(lambda sentence: ' '.join(id_to_word[i] for i in sentence), x_train))
x_test = list(map(lambda sentence: ' '.join(id_to_word[i] for i in sentence), x_test))
x_train = np.array(x_train, dtype=np.str)
x_test = np.array(x_test, dtype=np.str)
In [8]:
x_train.shape
Out[8]:
(25000,)
In [9]:
x_train[0]
Out[9]:
'<START> this <UNK> to <UNK> <UNK> of the <UNK> <UNK> of the <UNK> <UNK> <UNK> <UNK> i <UNK> <UNK> this <UNK> <UNK> the <UNK> <UNK> it was <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> the <UNK> <UNK> <UNK> <UNK> with <UNK> <UNK> <UNK> the <UNK> <UNK> <UNK> <UNK> it <UNK> was the <UNK> of the <UNK> <UNK> <UNK> in the <UNK> <UNK> <UNK> <UNK> to <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> that <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> to <UNK> this <UNK> <UNK> for a <UNK> it <UNK> <UNK> <UNK> <UNK> a <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> it <UNK> <UNK> <UNK> <UNK> in the <UNK> <UNK> a <UNK> <UNK> is <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> i <UNK> <UNK> <UNK> <UNK> <UNK> this is to <UNK> <UNK> <UNK> <UNK> <UNK> a <UNK> of <UNK> <UNK>'
In [10]:
y_train.shape
Out[10]:
(25000, 1)
In [11]:
y_train
Out[11]:
array([[0],
       [0],
       [0],
       ...,
       [0],
       [1],
       [0]])
In [12]:
x_test[0]
Out[12]:
'<START> i was <UNK> to <UNK> a <UNK> of this <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> the <UNK> <UNK> <UNK> in <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> a <UNK> <UNK> <UNK> as <UNK> <UNK> <UNK> of the <UNK> <UNK> <UNK> <UNK> and <UNK> <UNK> <UNK> <UNK> <UNK> the <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> and a <UNK> <UNK> <UNK> <UNK> <UNK> and <UNK> <UNK> <UNK> as the <UNK> <UNK> and in <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> <UNK> to <UNK> a <UNK> with a <UNK> <UNK> i <UNK> this <UNK> <UNK> <UNK> <UNK> <UNK> a <UNK> <UNK> <UNK> <UNK> to <UNK> <UNK> <UNK> as this <UNK> <UNK> <UNK> <UNK> <UNK> it <UNK> <UNK> <UNK> it'
In [ ]:
clf = ak.TextClassifier(verbose=True)
clf.fit(x_train, y_train, time_limit = 60)
#clf.final_fit(x_train, y_train, x_test, y_test, retrain=True)
Epoch:   0%|          | 0/4 [00:00<?, ?it/s]
Iteration:   0%|          | 0/782 [00:00<?, ?it/s]
***** Running training *****
Num examples = %d 25000
Batch size = %d 32
Num steps = %d 3125
Iteration:   0%|          | 1/782 [00:25<5:29:36, 25.32s/it]
In [ ]:
clf.evaluate(x_test, y_test)

Again the same problem: I could not run the TextClassifier on either the benchmark IMDB data or my Neanderthal introgressed vs. depleted texts. We need to wait for better releases of TextClassifier. Even then, there is no guarantee that AutoML will beat manual tuning of the Neural Networks, judging by the analysis in this blog https://www.pyimagesearch.com/2019/01/07/auto-keras-and-automl-a-getting-started-guide/.

In [8]:
import autokeras as ak
from autokeras import TextClassifier
Better speed can be achieved with apex installed from https://www.github.com/nvidia/apex.
In [16]:
classifier = TextClassifier(verbose = True)
classifier.fit(X_train, y_train, time_limit = 60 * 60)
classifier.final_fit(X_train, y_train, X_test, y_test, retrain = True)
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-16-304d5aad69eb> in <module>
      1 classifier = TextClassifier(verbose = True)
----> 2 classifier.fit(X_train, y_train, time_limit = 60 * 60)
      3 classifier.final_fit(X_train, y_train, X_test, y_test, retrain = True)

/usr/local/lib/python3.6/dist-packages/autokeras/text/text_supervised.py in fit(self, x, y, time_limit)
     62         """
     63         if not self.num_labels:
---> 64             self.num_labels = len(y[-1])
     65 
     66         # Prepare model

TypeError: object of type 'int' has no len()
In [15]:
y_train = [int(i) for i in y_train]
In [ ]:
scores = classifier.evaluate(X_test, y_test, verbose=0)
print("Accuracy: %.2f%%" % (scores[1]*100))

CNN + Embedding Layer for Neanderthal Introgressed vs. Depleted Sequence Classification

Bag of Words is a good model that compares K-mer frequencies between Neanderthal introgressed vs. depleted regions, and it achieves quite a high accuracy of sequence classification. However, it does not take connections between the words / K-mers into account. We can do a more advanced classification using Convolutional Neural Networks (CNNs) with a special Embedding Layer, which incorporates more memory into our model so that the words / K-mers remember their order in the sentence / sequence. The Embedding layer provides a way to learn word embeddings / numeric representations while performing the classification task. We again start by reading the sequences from the two fasta-files (introgressed and depleted regions), splitting them into words / K-mers and building sentences out of them.
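To see why a Conv1D layer is sensitive to local K-mer order, here is a minimal NumPy sketch of a single 1D convolution filter sliding over an embedded sequence (random toy embeddings and kernel, not the trained model):

```python
import numpy as np

def conv1d_single_filter(embedded, kernel):
    """'Valid' 1D convolution: the filter spans kernel_size consecutive word
    embeddings, so each output depends on the local order of the words."""
    seq_len, dim = embedded.shape
    kernel_size = kernel.shape[0]
    out_len = seq_len - kernel_size + 1  # no padding, as in Conv1D's 'valid' mode
    return np.array([np.sum(embedded[i:i + kernel_size] * kernel)
                     for i in range(out_len)])

rng = np.random.default_rng(42)
embedded = rng.standard_normal((3996, 10))  # one padded document, 10-dim embeddings
kernel = rng.standard_normal((5, 10))       # kernel_size = 5, as in the model below
out = conv1d_single_filter(embedded, kernel)
print(out.shape)  # (3992,) since 3996 - 5 + 1 = 3992
```

Permuting the K-mers inside a window changes the filter response, which is exactly the positional information the Bag of Words model throws away.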

In [1]:
import os
from Bio import SeqIO
from Bio.Seq import Seq

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

intr_file = 'hg19_intr_clean.fa'
depl_file = 'hg19_depl_clean.fa'

e = 0
intr_seqs = []
depl_seqs = []
for intr, depl in zip(SeqIO.parse(intr_file, 'fasta'), SeqIO.parse(depl_file, 'fasta')):
    
    cutoff = 4000
    my_intr_seq = str(intr.seq)[0:cutoff]
    my_depl_seq = str(depl.seq)[0:cutoff]
    
    intr_seqs.append(my_intr_seq)
    depl_seqs.append(my_depl_seq)

    e = e + 1
    if e%20000 == 0:
        print('Finished ' + str(e) + ' entries')
        
def getKmers(sequence, size):
    return [sequence[x:x+size].upper() for x in range(len(sequence) - size + 1)]

kmer = 5
intr_texts = [' '.join(getKmers(i, kmer)) for i in intr_seqs]
depl_texts = [' '.join(getKmers(i, kmer)) for i in depl_seqs]
Finished 20000 entries
Finished 40000 entries
Finished 60000 entries
In [2]:
merge_texts = intr_texts + depl_texts
len(merge_texts)
Out[2]:
147468
In [3]:
import numpy as np
labels = list(np.ones(len(intr_texts))) + list(np.zeros(len(depl_texts)))
print(len(labels))
147468

Now we are going to convert the Neanderthal introgressed and depleted texts into integers to be fed into the Keras Embedding Layer, after which every unique word will be represented by an integer. For this we use the fit_on_texts function from the Tokenizer class of Keras. The Keras Embedding layer requires all individual documents to have the same length, so we pad the shorter documents with 0. The input_length of the Embedding layer is therefore equal to the number of words in the longest document. To pad the shorter documents I am using the pad_sequences function from the Keras library.
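The following pure-Python sketch mimics what Tokenizer and pad_sequences do on two hypothetical toy sentences (one simplification: ids are assigned in order of first appearance, whereas the Keras Tokenizer orders them by word frequency; the real code below uses the Keras implementations):

```python
def fit_tokenizer(texts):
    """Assign each unique word an integer id (1-based; 0 is reserved for padding)."""
    vocab = {}
    for text in texts:
        for word in text.split():
            if word not in vocab:
                vocab[word] = len(vocab) + 1
    return vocab

def encode_and_pad(texts, vocab, max_length):
    """Replace words with their ids and post-pad with 0 up to max_length."""
    encoded = [[vocab[w] for w in t.split()] for t in texts]
    return [seq + [0] * (max_length - len(seq)) for seq in encoded]

texts = ['ACGTA CGTAC GTACG', 'ACGTA TTTTT']
vocab = fit_tokenizer(texts)
max_length = max(len(t.split()) for t in texts)
X = encode_and_pad(texts, vocab, max_length)
print(X)  # [[1, 2, 3], [1, 4, 0]] -- the shorter document is post-padded with 0
```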

In [4]:
from keras.preprocessing.text import Tokenizer
from keras.preprocessing.sequence import pad_sequences
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_extraction.text import TfidfTransformer

import warnings
warnings.filterwarnings('ignore')

#cv = CountVectorizer()
#X = cv.fit_transform(merge_texts)

#tfidf_transformer = TfidfTransformer()
#X = tfidf_transformer.fit_transform(X)

tokenizer = Tokenizer()
tokenizer.fit_on_texts(merge_texts)

#X = tokenizer.texts_to_matrix(merge_texts, mode = 'freq')

encoded_docs = tokenizer.texts_to_sequences(merge_texts)
max_length = max([len(s.split()) for s in merge_texts])
X = pad_sequences(encoded_docs, maxlen = max_length, padding = 'post')

#X = np.int32(X.toarray())

print(X)
print('\n')
print(X.shape)
Using TensorFlow backend.
[[ 76 633 356 ... 320 548 845]
 [578 167 569 ... 535 613 144]
 [578 167 569 ... 535 613 144]
 ...
 [346 188 516 ... 671  61  25]
 [856 987 997 ... 633 435 648]
 [ 48 409 456 ...  29  55  39]]


(147468, 3996)
In [6]:
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size = 0.20, random_state = 42)
print('Train input data dimensions:')
print(X_train.shape)
print(len(y_train))
print('\n')
print('Test input data dimensions:')
print(X_test.shape)
print(len(y_test))
Train input data dimensions:
(117974, 3996)
117974


Test input data dimensions:
(29494, 3996)
29494
In [7]:
print("The maximum number of words in any document: ", max_length)
The maximum number of words in any document:  3996
In [8]:
vocab_size = len(tokenizer.word_index) + 1
print('Vocabulary size, i.e. number of unique words in the text: ', vocab_size)
Vocabulary size, i.e. number of unique words in the text:  1025

Now all the documents have the same length (after padding), so we are ready to create and use the embeddings. Here I will design a Convolutional Neural Network (CNN) with an Embedding Layer, embedding the words into vectors of 10 dimensions.
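Under the hood an Embedding layer is just a trainable lookup table: each integer id selects one row of a (vocab_size x dim) weight matrix. A minimal NumPy sketch with random (untrained) weights, using the dimensions of our data:

```python
import numpy as np

vocab_size, dim, max_length = 1025, 10, 3996
rng = np.random.default_rng(0)
weights = rng.standard_normal((vocab_size, dim))  # the trainable embedding matrix

def embed(padded_ids, weights):
    """Map a padded sequence of integer ids to a (seq_len, dim) matrix of vectors."""
    return weights[padded_ids]

doc = np.zeros(max_length, dtype=int)  # a fully padded toy document
doc[:3] = [76, 633, 356]               # first three K-mer ids of the first document
embedded = embed(doc, weights)
print(embedded.shape)  # (3996, 10): the input shape the Conv1D layer receives
```

During training, gradients flow back into the selected rows of the weight matrix, which is how K-mers with similar predictive roles end up with similar vectors.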

In [25]:
from keras.models import Sequential
from keras.callbacks import ModelCheckpoint
from keras.optimizers import SGD, Adam, Adadelta
from keras.layers import Conv1D, Dense, MaxPooling1D, Flatten, Dropout, Embedding, Activation

import warnings
warnings.filterwarnings('ignore')

model = Sequential()
model.add(Embedding(vocab_size, 10, input_length = max_length))
#model.add(Dropout(0.5))
model.add(Conv1D(filters = 16, kernel_size = 5, activation = 'relu'))
model.add(MaxPooling1D(pool_size = 2))

model.add(Flatten())
#model.add(Dense(8, activation = 'relu'))
#model.add(Dropout(0.5))
model.add(Dense(1, activation = 'sigmoid'))

epochs = 50
lrate = 0.0001
decay = lrate / epochs
#sgd = SGD(lr = lrate, momentum = 0.99, nesterov = True)
sgd = SGD(lr = lrate, momentum = 0.9, nesterov = False)
#sgd = SGD(lr = lrate, momentum = 0.9, decay = decay, nesterov = False)
#model.compile(loss='binary_crossentropy', optimizer=Adam(lr = lrate), metrics=['binary_accuracy'])
#model.compile(loss = 'binary_crossentropy', optimizer = 'adam', metrics = ['binary_accuracy'])
model.compile(loss = 'binary_crossentropy', optimizer = sgd, metrics = ['binary_accuracy'])
checkpoint = ModelCheckpoint("weights.best.hdf5", monitor = 'val_binary_accuracy', verbose = 1, 
                             save_best_only = True, mode = 'max')
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding_10 (Embedding)     (None, 3996, 10)          10250     
_________________________________________________________________
conv1d_10 (Conv1D)           (None, 3992, 16)          816       
_________________________________________________________________
max_pooling1d_10 (MaxPooling (None, 1996, 16)          0         
_________________________________________________________________
flatten_10 (Flatten)         (None, 31936)             0         
_________________________________________________________________
dense_17 (Dense)             (None, 1)                 31937     
=================================================================
Total params: 43,003
Trainable params: 43,003
Non-trainable params: 0
_________________________________________________________________
In [26]:
import warnings
warnings.filterwarnings('ignore')

history = model.fit(X_train, y_train, 
                    epochs = epochs, verbose = 1, validation_split = 0.2, batch_size = 32, shuffle = True, 
                    callbacks = [checkpoint])
Train on 94379 samples, validate on 23595 samples
Epoch 1/50
94379/94379 [==============================] - 209s 2ms/step - loss: 0.6932 - binary_accuracy: 0.5005 - val_loss: 0.6931 - val_binary_accuracy: 0.5014

Epoch 00001: val_binary_accuracy improved from -inf to 0.50142, saving model to weights.best.hdf5
Epoch 2/50
94379/94379 [==============================] - 216s 2ms/step - loss: 0.6931 - binary_accuracy: 0.5041 - val_loss: 0.6931 - val_binary_accuracy: 0.5055

Epoch 00002: val_binary_accuracy improved from 0.50142 to 0.50549, saving model to weights.best.hdf5
Epoch 3/50
94379/94379 [==============================] - 215s 2ms/step - loss: 0.6929 - binary_accuracy: 0.5093 - val_loss: 0.6930 - val_binary_accuracy: 0.5076

Epoch 00003: val_binary_accuracy improved from 0.50549 to 0.50765, saving model to weights.best.hdf5
Epoch 4/50
94379/94379 [==============================] - 218s 2ms/step - loss: 0.6927 - binary_accuracy: 0.5150 - val_loss: 0.6929 - val_binary_accuracy: 0.5086

Epoch 00004: val_binary_accuracy improved from 0.50765 to 0.50858, saving model to weights.best.hdf5
Epoch 5/50
94379/94379 [==============================] - 217s 2ms/step - loss: 0.6926 - binary_accuracy: 0.5180 - val_loss: 0.6928 - val_binary_accuracy: 0.5122

Epoch 00005: val_binary_accuracy improved from 0.50858 to 0.51218, saving model to weights.best.hdf5
Epoch 6/50
94379/94379 [==============================] - 221s 2ms/step - loss: 0.6924 - binary_accuracy: 0.5239 - val_loss: 0.6928 - val_binary_accuracy: 0.5095

Epoch 00006: val_binary_accuracy did not improve from 0.51218
Epoch 7/50
94379/94379 [==============================] - 221s 2ms/step - loss: 0.6922 - binary_accuracy: 0.5260 - val_loss: 0.6927 - val_binary_accuracy: 0.5133

Epoch 00007: val_binary_accuracy improved from 0.51218 to 0.51333, saving model to weights.best.hdf5
Epoch 8/50
94379/94379 [==============================] - 223s 2ms/step - loss: 0.6920 - binary_accuracy: 0.5291 - val_loss: 0.6926 - val_binary_accuracy: 0.5165

Epoch 00008: val_binary_accuracy improved from 0.51333 to 0.51647, saving model to weights.best.hdf5
Epoch 9/50
94379/94379 [==============================] - 220s 2ms/step - loss: 0.6918 - binary_accuracy: 0.5347 - val_loss: 0.6926 - val_binary_accuracy: 0.5118

Epoch 00009: val_binary_accuracy did not improve from 0.51647
Epoch 10/50
94379/94379 [==============================] - 220s 2ms/step - loss: 0.6916 - binary_accuracy: 0.5395 - val_loss: 0.6923 - val_binary_accuracy: 0.5257

Epoch 00010: val_binary_accuracy improved from 0.51647 to 0.52570, saving model to weights.best.hdf5
Epoch 11/50
94379/94379 [==============================] - 224s 2ms/step - loss: 0.6914 - binary_accuracy: 0.5428 - val_loss: 0.6922 - val_binary_accuracy: 0.5268

Epoch 00011: val_binary_accuracy improved from 0.52570 to 0.52676, saving model to weights.best.hdf5
Epoch 12/50
94379/94379 [==============================] - 221s 2ms/step - loss: 0.6912 - binary_accuracy: 0.5452 - val_loss: 0.6921 - val_binary_accuracy: 0.5305

Epoch 00012: val_binary_accuracy improved from 0.52676 to 0.53045, saving model to weights.best.hdf5
Epoch 13/50
94379/94379 [==============================] - 225s 2ms/step - loss: 0.6910 - binary_accuracy: 0.5499 - val_loss: 0.6920 - val_binary_accuracy: 0.5367

Epoch 00013: val_binary_accuracy improved from 0.53045 to 0.53672, saving model to weights.best.hdf5
Epoch 14/50
94379/94379 [==============================] - 225s 2ms/step - loss: 0.6908 - binary_accuracy: 0.5548 - val_loss: 0.6919 - val_binary_accuracy: 0.5364

Epoch 00014: val_binary_accuracy did not improve from 0.53672
Epoch 15/50
94379/94379 [==============================] - 227s 2ms/step - loss: 0.6905 - binary_accuracy: 0.5597 - val_loss: 0.6918 - val_binary_accuracy: 0.5365

Epoch 00015: val_binary_accuracy did not improve from 0.53672
Epoch 16/50
94379/94379 [==============================] - 225s 2ms/step - loss: 0.6903 - binary_accuracy: 0.5593 - val_loss: 0.6917 - val_binary_accuracy: 0.5269

Epoch 00016: val_binary_accuracy did not improve from 0.53672
Epoch 17/50
94379/94379 [==============================] - 224s 2ms/step - loss: 0.6900 - binary_accuracy: 0.5689 - val_loss: 0.6916 - val_binary_accuracy: 0.5325

Epoch 00017: val_binary_accuracy did not improve from 0.53672
Epoch 18/50
94379/94379 [==============================] - 225s 2ms/step - loss: 0.6898 - binary_accuracy: 0.5635 - val_loss: 0.6914 - val_binary_accuracy: 0.5405

Epoch 00018: val_binary_accuracy improved from 0.53672 to 0.54050, saving model to weights.best.hdf5
Epoch 19/50
94379/94379 [==============================] - 228s 2ms/step - loss: 0.6895 - binary_accuracy: 0.5683 - val_loss: 0.6912 - val_binary_accuracy: 0.5461

Epoch 00019: val_binary_accuracy improved from 0.54050 to 0.54613, saving model to weights.best.hdf5
Epoch 20/50
94379/94379 [==============================] - 227s 2ms/step - loss: 0.6893 - binary_accuracy: 0.5726 - val_loss: 0.6912 - val_binary_accuracy: 0.5350

Epoch 00020: val_binary_accuracy did not improve from 0.54613
Epoch 21/50
94379/94379 [==============================] - 227s 2ms/step - loss: 0.6890 - binary_accuracy: 0.5741 - val_loss: 0.6912 - val_binary_accuracy: 0.5302

Epoch 00021: val_binary_accuracy did not improve from 0.54613
Epoch 22/50
94379/94379 [==============================] - 225s 2ms/step - loss: 0.6888 - binary_accuracy: 0.5770 - val_loss: 0.6909 - val_binary_accuracy: 0.5407

Epoch 00022: val_binary_accuracy did not improve from 0.54613
Epoch 23/50
94379/94379 [==============================] - 224s 2ms/step - loss: 0.6885 - binary_accuracy: 0.5847 - val_loss: 0.6910 - val_binary_accuracy: 0.5245

Epoch 00023: val_binary_accuracy did not improve from 0.54613
Epoch 24/50
94379/94379 [==============================] - 223s 2ms/step - loss: 0.6882 - binary_accuracy: 0.5870 - val_loss: 0.6906 - val_binary_accuracy: 0.5453

Epoch 00024: val_binary_accuracy did not improve from 0.54613
Epoch 25/50
94379/94379 [==============================] - 225s 2ms/step - loss: 0.6879 - binary_accuracy: 0.5884 - val_loss: 0.6904 - val_binary_accuracy: 0.5605

Epoch 00025: val_binary_accuracy improved from 0.54613 to 0.56046, saving model to weights.best.hdf5
Epoch 26/50
94379/94379 [==============================] - 224s 2ms/step - loss: 0.6876 - binary_accuracy: 0.5933 - val_loss: 0.6903 - val_binary_accuracy: 0.5488

Epoch 00026: val_binary_accuracy did not improve from 0.56046
Epoch 27/50
94379/94379 [==============================] - 227s 2ms/step - loss: 0.6873 - binary_accuracy: 0.5893 - val_loss: 0.6902 - val_binary_accuracy: 0.5458

Epoch 00027: val_binary_accuracy did not improve from 0.56046
Epoch 28/50
94379/94379 [==============================] - 232s 2ms/step - loss: 0.6870 - binary_accuracy: 0.5956 - val_loss: 0.6899 - val_binary_accuracy: 0.5666

Epoch 00028: val_binary_accuracy improved from 0.56046 to 0.56656, saving model to weights.best.hdf5
Epoch 29/50
94379/94379 [==============================] - 242s 3ms/step - loss: 0.6867 - binary_accuracy: 0.5931 - val_loss: 0.6899 - val_binary_accuracy: 0.5488

Epoch 00029: val_binary_accuracy did not improve from 0.56656
Epoch 30/50
94379/94379 [==============================] - 221s 2ms/step - loss: 0.6864 - binary_accuracy: 0.6007 - val_loss: 0.6898 - val_binary_accuracy: 0.5488

Epoch 00030: val_binary_accuracy did not improve from 0.56656
Epoch 31/50
94379/94379 [==============================] - 216s 2ms/step - loss: 0.6861 - binary_accuracy: 0.6031 - val_loss: 0.6895 - val_binary_accuracy: 0.5652

Epoch 00031: val_binary_accuracy did not improve from 0.56656
Epoch 32/50
94379/94379 [==============================] - 219s 2ms/step - loss: 0.6856 - binary_accuracy: 0.5950 - val_loss: 0.6894 - val_binary_accuracy: 0.5610

Epoch 00032: val_binary_accuracy did not improve from 0.56656
Epoch 33/50
94379/94379 [==============================] - 218s 2ms/step - loss: 0.6854 - binary_accuracy: 0.6110 - val_loss: 0.6892 - val_binary_accuracy: 0.5576

Epoch 00033: val_binary_accuracy did not improve from 0.56656
Epoch 34/50
94379/94379 [==============================] - 223s 2ms/step - loss: 0.6850 - binary_accuracy: 0.6072 - val_loss: 0.6889 - val_binary_accuracy: 0.5652

Epoch 00034: val_binary_accuracy did not improve from 0.56656
Epoch 35/50
94379/94379 [==============================] - 212s 2ms/step - loss: 0.6846 - binary_accuracy: 0.6122 - val_loss: 0.6887 - val_binary_accuracy: 0.5711

Epoch 00035: val_binary_accuracy improved from 0.56656 to 0.57114, saving model to weights.best.hdf5
Epoch 36/50
94379/94379 [==============================] - 213s 2ms/step - loss: 0.6842 - binary_accuracy: 0.6143 - val_loss: 0.6890 - val_binary_accuracy: 0.5313

Epoch 00036: val_binary_accuracy did not improve from 0.57114
Epoch 37/50
94379/94379 [==============================] - 214s 2ms/step - loss: 0.6839 - binary_accuracy: 0.6184 - val_loss: 0.6882 - val_binary_accuracy: 0.5791

Epoch 00037: val_binary_accuracy improved from 0.57114 to 0.57911, saving model to weights.best.hdf5
Epoch 38/50
94379/94379 [==============================] - 218s 2ms/step - loss: 0.6834 - binary_accuracy: 0.6120 - val_loss: 0.6880 - val_binary_accuracy: 0.5786

Epoch 00038: val_binary_accuracy did not improve from 0.57911
Epoch 39/50
94379/94379 [==============================] - 214s 2ms/step - loss: 0.6830 - binary_accuracy: 0.6227 - val_loss: 0.6879 - val_binary_accuracy: 0.5693

Epoch 00039: val_binary_accuracy did not improve from 0.57911
Epoch 40/50
94379/94379 [==============================] - 217s 2ms/step - loss: 0.6825 - binary_accuracy: 0.6237 - val_loss: 0.6876 - val_binary_accuracy: 0.5808

Epoch 00040: val_binary_accuracy improved from 0.57911 to 0.58080, saving model to weights.best.hdf5
Epoch 41/50
94379/94379 [==============================] - 218s 2ms/step - loss: 0.6820 - binary_accuracy: 0.6234 - val_loss: 0.6873 - val_binary_accuracy: 0.5834

Epoch 00041: val_binary_accuracy improved from 0.58080 to 0.58343, saving model to weights.best.hdf5
Epoch 42/50
94379/94379 [==============================] - 218s 2ms/step - loss: 0.6816 - binary_accuracy: 0.6303 - val_loss: 0.6872 - val_binary_accuracy: 0.5618

Epoch 00042: val_binary_accuracy did not improve from 0.58343
Epoch 43/50
94379/94379 [==============================] - 218s 2ms/step - loss: 0.6811 - binary_accuracy: 0.6274 - val_loss: 0.6868 - val_binary_accuracy: 0.5753

Epoch 00043: val_binary_accuracy did not improve from 0.58343
Epoch 44/50
94379/94379 [==============================] - 219s 2ms/step - loss: 0.6806 - binary_accuracy: 0.6356 - val_loss: 0.6865 - val_binary_accuracy: 0.5768

Epoch 00044: val_binary_accuracy did not improve from 0.58343
Epoch 45/50
94379/94379 [==============================] - 223s 2ms/step - loss: 0.6801 - binary_accuracy: 0.6296 - val_loss: 0.6869 - val_binary_accuracy: 0.5393

Epoch 00045: val_binary_accuracy did not improve from 0.58343
Epoch 46/50
94379/94379 [==============================] - 2505s 27ms/step - loss: 0.6795 - binary_accuracy: 0.6349 - val_loss: 0.6866 - val_binary_accuracy: 0.5399

Epoch 00046: val_binary_accuracy did not improve from 0.58343
Epoch 47/50
94379/94379 [==============================] - 201s 2ms/step - loss: 0.6788 - binary_accuracy: 0.6294 - val_loss: 0.6863 - val_binary_accuracy: 0.5417

Epoch 00047: val_binary_accuracy did not improve from 0.58343
Epoch 48/50
94379/94379 [==============================] - 207s 2ms/step - loss: 0.6782 - binary_accuracy: 0.6312 - val_loss: 0.6852 - val_binary_accuracy: 0.5730

Epoch 00048: val_binary_accuracy did not improve from 0.58343
Epoch 49/50
94379/94379 [==============================] - 215s 2ms/step - loss: 0.6776 - binary_accuracy: 0.6383 - val_loss: 0.6854 - val_binary_accuracy: 0.5507

Epoch 00049: val_binary_accuracy did not improve from 0.58343
Epoch 50/50
94379/94379 [==============================] - 213s 2ms/step - loss: 0.6769 - binary_accuracy: 0.6342 - val_loss: 0.6845 - val_binary_accuracy: 0.5728

Epoch 00050: val_binary_accuracy did not improve from 0.58343
In [28]:
import matplotlib.pyplot as plt

plt.figure(figsize=(20,15))
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('Model Loss', fontsize = 20)
plt.ylabel('Loss', fontsize = 20)
plt.xlabel('Epoch', fontsize = 20)
plt.legend(['Train', 'Validation'], fontsize = 20)
plt.show()

plt.figure(figsize=(20,15))
plt.plot(history.history['binary_accuracy'])
plt.plot(history.history['val_binary_accuracy'])
plt.title('Model Accuracy', fontsize = 20)
plt.ylabel('Accuracy', fontsize = 20)
plt.xlabel('Epoch', fontsize = 20)
plt.legend(['Train', 'Validation'], fontsize = 20)
plt.show()
In [29]:
from sklearn.metrics import confusion_matrix
import itertools

plt.figure(figsize=(15,10))

predicted_labels = model.predict(X_test)
cm = confusion_matrix(y_test, [np.round(i[0]) for i in predicted_labels])
print('Confusion matrix:\n',cm)

cm = cm.astype('float') / cm.sum(axis = 1)[:, np.newaxis]

plt.imshow(cm, cmap = plt.cm.Blues)
plt.title('Normalized confusion matrix', fontsize = 20)
plt.colorbar()
plt.xlabel('Predicted label', fontsize = 20)
plt.ylabel('True label', fontsize = 20)
for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):
    plt.text(j, i, format(cm[i, j], '.2f'),
             horizontalalignment = 'center', verticalalignment = 'center', fontsize = 20,
             color='white' if cm[i, j] > 0.5 else 'black')
plt.show()
Confusion matrix:
 [[ 6318  8595]
 [ 3905 10676]]
In [30]:
scores = model.evaluate(X_test, y_test, verbose=0)
print("Accuracy: %.2f%%" % (scores[1]*100))
Accuracy: 57.62%
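As a quick sanity check using only the numbers printed above, the reported accuracy is the trace of the confusion matrix divided by the total number of test documents:

```python
import numpy as np

# Confusion matrix printed above (rows = true labels, columns = predicted)
cm = np.array([[ 6318,  8595],
               [ 3905, 10676]])

# Accuracy = correctly classified documents / all documents
accuracy = np.trace(cm) / cm.sum()
print(round(accuracy * 100, 2))  # 57.62
```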

LSTM + Embedding Layer for Neanderthal Introgressed vs. Depleted Sequence Classification

Bag of Words is a useful model that compares K-mer frequencies between Neanderthal introgressed and depleted regions, and it achieves quite high classification accuracy. However, it does not take the connections between the words / K-mers into account. We can perform a more advanced classification using an LSTM with an Embedding layer, which incorporates memory into the model so that the words / K-mers retain their order within the sentence / sequence. The Embedding layer provides a way to learn word embeddings, i.e. numeric representations of the words, while performing the classification task. We again start by reading the sequences from the two fasta-files (introgressed and depleted regions), splitting them into words / K-mers, and building sentences out of them.
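Before diving into the data, it may help to see what an Embedding layer actually does. As a minimal sketch with toy, made-up sizes (not the notebook's real vocabulary), it is just a trainable lookup table: each integer token id selects a row of a weight matrix, and the recurrent layer then reads those rows in order:

```python
import numpy as np

# Toy illustration (hypothetical sizes): an Embedding layer is a trainable
# lookup table mapping each integer token id to a dense vector.
vocab_size, embed_dim = 8, 4
rng = np.random.default_rng(0)
E = rng.normal(size=(vocab_size, embed_dim))   # the layer's weight matrix

sentence = np.array([3, 1, 7, 1])              # a "sentence" of k-mer token ids
embedded = E[sentence]                         # one row per token, shape (4, 4)

print(embedded.shape)                          # (4, 4)
# Identical tokens get identical vectors; an LSTM then reads these vectors
# left to right, carrying a hidden state that preserves word order.
```

During training the rows of `E` are updated by backpropagation together with the rest of the network, which is how the k-mer embeddings are learned.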

In [1]:
import os
from Bio import SeqIO
from Bio.Seq import Seq

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

intr_file = 'hg19_intr_clean.fa'
depl_file = 'hg19_depl_clean.fa'

e = 0
intr_seqs = []
depl_seqs = []
for intr, depl in zip(SeqIO.parse(intr_file, 'fasta'), SeqIO.parse(depl_file, 'fasta')):
    
    #cutoff = 200
    #my_intr_seq = str(intr.seq)[0:cutoff]
    #my_depl_seq = str(depl.seq)[0:cutoff]
    #intr_seqs.append(my_intr_seq)
    #depl_seqs.append(my_depl_seq)
    
    step = 200; jump = 1; a = 0; b = step; n_jumps = 5
    for j in range(n_jumps):
        s_intr = str(intr.seq)[a:b]
        s_depl = str(depl.seq)[a:b]
        intr_seqs.append(s_intr)
        depl_seqs.append(s_depl)
        a = a + jump
        b = a + step
    
    e = e + 1
    if e%20000 == 0:
        print('Finished ' + str(e) + ' entries')
        
def getKmers(sequence, size):
    return [sequence[x:x+size].upper() for x in range(len(sequence) - size + 1)]

kmer = 10
intr_texts = [' '.join(getKmers(i, kmer)) for i in intr_seqs]
depl_texts = [' '.join(getKmers(i, kmer)) for i in depl_seqs]
Finished 20000 entries
Finished 40000 entries
Finished 60000 entries
In [2]:
merge_texts = intr_texts + depl_texts
len(merge_texts)
Out[2]:
737340
In [3]:
import numpy as np
labels = list(np.ones(len(intr_texts))) + list(np.zeros(len(depl_texts)))
print(len(labels))
737340
In [5]:
from keras.preprocessing.text import Tokenizer
from keras.preprocessing.sequence import pad_sequences
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_extraction.text import TfidfTransformer

import warnings
warnings.filterwarnings('ignore')

#cv = CountVectorizer()
#X = cv.fit_transform(merge_texts)

#tfidf_transformer = TfidfTransformer()
#X = tfidf_transformer.fit_transform(X)

tokenizer = Tokenizer()
tokenizer.fit_on_texts(merge_texts)
#X = tokenizer.texts_to_matrix(merge_texts, mode = 'freq')

encoded_docs = tokenizer.texts_to_sequences(merge_texts)
max_length = max([len(s.split()) for s in merge_texts])
X = pad_sequences(encoded_docs, maxlen = max_length, padding = 'post')

print(X)
print('\n')
print(X.shape)
[[ 63302 207900 269566 ... 100801 127303 250471]
 [207900 269566 319101 ... 127303 250471 261360]
 [269566 319101 125927 ... 250471 261360 543311]
 ...
 [411966 308608 194078 ... 114078 325569 333197]
 [308608 194078 228176 ... 325569 333197 149901]
 [194078 228176  64264 ... 333197 149901 368428]]


(737340, 191)
In [6]:
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size = 0.20, random_state = 42)
In [7]:
print(X_train.shape)
print(X_test.shape)
(589872, 191)
(147468, 191)
In [8]:
max_length = max([len(s.split()) for s in merge_texts])
print(max_length)
191
In [9]:
vocab_size = len(tokenizer.word_index) + 1
print(vocab_size)
964114
In [10]:
from keras.models import Sequential
from keras.callbacks import ModelCheckpoint
from keras.optimizers import SGD, Adam, Adadelta, RMSprop
from keras.layers import Conv1D, Dense, MaxPooling1D, Flatten, Dropout
from keras.layers import Embedding, GlobalAveragePooling1D, LSTM, SimpleRNN, GRU, Bidirectional

model = Sequential()
model.add(Embedding(vocab_size, 10)) #dropout = 0.2 #input_length = max_length
#model.add(Conv1D(filters = 16, kernel_size = 5, padding = 'same', activation = 'relu'))
#model.add(MaxPooling1D(pool_size = 2))
#model.add(LSTM(10)) #dropout = 0.2, recurrent_dropout = 0.2
model.add(Bidirectional(LSTM(10))) #dropout = 0.2, recurrent_dropout = 0.2
#model.add(Bidirectional(SimpleRNN(10)))
#model.add(GRU(10))
#model.add(SimpleRNN(10, dropout = 0.2, recurrent_dropout = 0.2))
#model.add(Flatten())
model.add(Dense(10, activation = 'relu'))
model.add(Dense(1, activation = 'sigmoid'))

epochs = 5
model.compile(loss = 'binary_crossentropy', optimizer = 'rmsprop', metrics = ['accuracy'])
#model.compile(loss = 'binary_crossentropy', optimizer = 'adam', metrics = ['accuracy'])
#model.compile(loss = 'binary_crossentropy', optimizer = 'SGD', metrics = ['accuracy'])
#model.compile(loss = 'binary_crossentropy', optimizer = RMSprop(lr = 0.0001), metrics = ['accuracy'])
checkpoint = ModelCheckpoint("weights.best.hdf5", monitor = 'val_acc', verbose = 1, 
                             save_best_only = True, mode = 'max')
print(model.summary())
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding_2 (Embedding)      (None, None, 10)          9641140   
_________________________________________________________________
bidirectional_2 (Bidirection (None, 20)                1680      
_________________________________________________________________
dense_3 (Dense)              (None, 10)                210       
_________________________________________________________________
dense_4 (Dense)              (None, 1)                 11        
=================================================================
Total params: 9,643,041
Trainable params: 9,643,041
Non-trainable params: 0
_________________________________________________________________
None
In [12]:
import warnings
warnings.filterwarnings('ignore')

history = model.fit(X_train, y_train, 
                    epochs = epochs, verbose = 1, validation_split = 0.2, batch_size = 32, shuffle = True, 
                    callbacks = [checkpoint])
Train on 471897 samples, validate on 117975 samples
Epoch 1/5
471897/471897 [==============================] - 3035s 6ms/step - loss: 0.2911 - acc: 0.8573 - val_loss: 0.0965 - val_acc: 0.9636

Epoch 00001: val_acc improved from -inf to 0.96362, saving model to weights.best.hdf5
Epoch 2/5
471897/471897 [==============================] - 5225s 11ms/step - loss: 0.0371 - acc: 0.9869 - val_loss: 0.0850 - val_acc: 0.9716

Epoch 00002: val_acc improved from 0.96362 to 0.97165, saving model to weights.best.hdf5
Epoch 3/5
471897/471897 [==============================] - 3116s 7ms/step - loss: 0.0133 - acc: 0.9957 - val_loss: 0.0394 - val_acc: 0.9888

Epoch 00003: val_acc improved from 0.97165 to 0.98881, saving model to weights.best.hdf5
Epoch 4/5
471897/471897 [==============================] - 3172s 7ms/step - loss: 0.0055 - acc: 0.9984 - val_loss: 0.0388 - val_acc: 0.9915

Epoch 00004: val_acc improved from 0.98881 to 0.99149, saving model to weights.best.hdf5
Epoch 5/5
471897/471897 [==============================] - 3160s 7ms/step - loss: 0.0026 - acc: 0.9994 - val_loss: 0.0393 - val_acc: 0.9941

Epoch 00005: val_acc improved from 0.99149 to 0.99411, saving model to weights.best.hdf5
In [14]:
import matplotlib.pyplot as plt

plt.figure(figsize=(20,15))
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('Model Loss', fontsize = 20)
plt.ylabel('Loss', fontsize = 20)
plt.xlabel('Epoch', fontsize = 20)
plt.legend(['Train', 'Validation'], fontsize = 20)
plt.show()

plt.figure(figsize=(20,15))
plt.plot(history.history['acc'])
plt.plot(history.history['val_acc'])
plt.title('Model Accuracy', fontsize = 20)
plt.ylabel('Accuracy', fontsize = 20)
plt.xlabel('Epoch', fontsize = 20)
plt.legend(['Train', 'Validation'], fontsize = 20)
plt.show()
In [15]:
model.load_weights("weights.best.hdf5")
model.compile(loss = 'binary_crossentropy', optimizer = 'rmsprop', metrics = ['accuracy'])
In [16]:
from sklearn.metrics import confusion_matrix
import matplotlib.pyplot as plt
import itertools

plt.figure(figsize = (15,10))

predicted_labels = model.predict(X_test)
cm = confusion_matrix(y_test, [np.round(i[0]) for i in predicted_labels])
print('Confusion matrix:\n',cm)

cm = cm.astype('float') / cm.sum(axis = 1)[:, np.newaxis]

plt.imshow(cm, cmap = plt.cm.Blues)
plt.title('Normalized confusion matrix', fontsize = 20)
plt.colorbar()
plt.xlabel('Predicted label', fontsize = 20)
plt.ylabel('True label', fontsize = 20)
for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):
    plt.text(j, i, format(cm[i, j], '.2f'),
             horizontalalignment = 'center', verticalalignment = 'center', fontsize = 20,
             color='white' if cm[i, j] > 0.5 else 'black')
plt.show()
Confusion matrix:
 [[73377   482]
 [  315 73294]]
In [17]:
scores = model.evaluate(X_test, y_test, verbose = 0)
print("Accuracy: %.2f%%" % (scores[1]*100))
Accuracy: 99.46%
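Accuracy alone can hide class imbalance, so it is worth deriving precision and recall for the introgressed class directly from the confusion matrix printed above (rows are true labels, columns are predicted labels):

```python
import numpy as np

# Confusion matrix printed above; label 1 = introgressed
cm = np.array([[73377,   482],
               [  315, 73294]])
tp, fp, fn = cm[1, 1], cm[0, 1], cm[1, 0]

precision = tp / (tp + fp)   # how many predicted introgressed are truly introgressed
recall    = tp / (tp + fn)   # how many truly introgressed were recovered
print(round(precision, 4), round(recall, 4))  # 0.9935 0.9957
```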

Now let us visualize the word embeddings from the Embedding layer.

In [18]:
e = model.layers[0]
weights = e.get_weights()[0]
print(weights.shape) # shape: (vocab_size, embedding_dim)
(964114, 10)
In [30]:
words = [i.upper() for i in list(tokenizer.index_word.values())]
words[0:5]
Out[30]:
['TTTTTTTTTT', 'AAAAAAAAAA', 'TGTGTGTGTG', 'GTGTGTGTGT', 'CACACACACA']
In [32]:
import os
import io

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr')

out_v = io.open('vecs.tsv', 'w', encoding='utf-8')
out_m = io.open('meta.tsv', 'w', encoding='utf-8')

for num, word in enumerate(words):
  vec = weights[num + 1] # skip 0, it's padding.
  out_m.write(word + "\n")
  out_v.write('\t'.join([str(x) for x in vec]) + "\n")
out_v.close()
out_m.close()

For visualization we will use the Embedding Projector (http://projector.tensorflow.org/), following the tutorial https://www.tensorflow.org/tutorials/text/word_embeddings:

In [33]:
from IPython.display import Image
path = '/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/'
Image(path + 'WordEmbeddings.gif.png', width=2000)
Out[33]:

Here we will use the chi-square test for ranking the most important k-mers / words in the two texts: Neanderthal introgressed vs. depleted segments / sentences.

In [4]:
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_extraction.text import TfidfTransformer

import warnings
warnings.filterwarnings('ignore')

cv = CountVectorizer()
X_cv = cv.fit_transform(merge_texts)

tfidf_transformer = TfidfTransformer()
X_tfidf = tfidf_transformer.fit_transform(X_cv)
In [7]:
import matplotlib.pyplot as plt
from sklearn.feature_selection import chi2

chi2score = chi2(X_tfidf, labels)[0]

plt.figure(figsize = (20, 15))
scores = list(zip([i.upper() for i in cv.get_feature_names()], chi2score))
chi2_sorted = sorted(scores, key = lambda x: x[1])  # avoid shadowing the imported chi2 function
topchi2 = list(zip(*chi2_sorted[-20:]))
x = range(len(topchi2[1]))
top_words = topchi2[0]  # avoid overwriting the class labels list
plt.barh(x, topchi2[1], align = 'center', alpha = 0.5)
plt.plot(topchi2[1], x, '-o', markersize = 5, alpha = 0.8)
plt.yticks(x, top_words)
plt.xlabel(r'$\chi^2$')
plt.show()

Now I am going to shuffle each word position in the sentences across all sentences. In this way I find the most important word positions averaged across all sentences, and then I can simply check the k-mer counts at those important positions. This gives some idea of which k-mers (not just positions) are most informative.
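The same permutation idea can be sketched on toy data, independently of the notebook's LSTM. In this hypothetical setup only the first column carries signal, so shuffling it should cause the largest accuracy drop:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy permutation importance (assumed setup, not the notebook's model):
# shuffle one test column at a time and record the accuracy drop.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = (X[:, 0] > 0).astype(int)            # only column 0 is informative

clf = LogisticRegression().fit(X[:400], y[:400])
base = clf.score(X[400:], y[400:])       # baseline test accuracy

drops = []
for j in range(X.shape[1]):
    X_perm = X[400:].copy()              # copy, so the original stays intact
    X_perm[:, j] = rng.permutation(X_perm[:, j])
    drops.append(base - clf.score(X_perm, y[400:]))

print(np.argmax(drops))                  # column 0 shows the largest drop
```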

In [11]:
from keras.models import Sequential
from keras.callbacks import ModelCheckpoint
from keras.optimizers import SGD, Adam, Adadelta, RMSprop
from keras.layers import Conv1D, Dense, MaxPooling1D, Flatten, Dropout
from keras.layers import Embedding, GlobalAveragePooling1D, LSTM, SimpleRNN, GRU, Bidirectional

model = Sequential()
model.add(Embedding(vocab_size, 10))
model.add(Bidirectional(LSTM(10)))
model.add(Dense(10, activation = 'relu'))
model.add(Dense(1, activation = 'sigmoid'))

model.load_weights("LSTM.weights.best.hdf5")
model.compile(loss = 'binary_crossentropy', optimizer = 'rmsprop', metrics = ['accuracy'])
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding_2 (Embedding)      (None, None, 10)          9641140   
_________________________________________________________________
bidirectional_2 (Bidirection (None, 20)                1680      
_________________________________________________________________
dense_3 (Dense)              (None, 10)                210       
_________________________________________________________________
dense_4 (Dense)              (None, 1)                 11        
=================================================================
Total params: 9,643,041
Trainable params: 9,643,041
Non-trainable params: 0
_________________________________________________________________
In [12]:
X_test.shape
Out[12]:
(147468, 191)
In [13]:
X_test[0:5,0:5]
Out[13]:
array([[303077,  93490, 177747,  44335,  69215],
       [364595, 217876, 193494,  71357,  11831],
       [194615,  49295, 302291,  74385, 269246],
       [242441, 125472, 309272, 480234, 391093],
       [   974,   1155,    839,    368,    203]], dtype=int32)
In [15]:
import numpy as np
np.savetxt('X_test.txt', X_test)
In [17]:
X_test_read = np.int32(np.loadtxt('X_test.txt')).reshape((147468, 191))
X_test_read[0:5,0:5]
Out[17]:
array([[303077,  93490, 177747,  44335,  69215],
       [364595, 217876, 193494,  71357,  11831],
       [194615,  49295, 302291,  74385, 269246],
       [242441, 125472, 309272, 480234, 391093],
       [   974,   1155,    839,    368,    203]], dtype=int32)
In [18]:
X_test[:,0]
Out[18]:
array([303077, 364595, 194615, ..., 476335,  51956, 129954], dtype=int32)
In [19]:
X_test_orig = X_test
In [20]:
X_test_orig[0:5,0:5]
Out[20]:
array([[303077,  93490, 177747,  44335,  69215],
       [364595, 217876, 193494,  71357,  11831],
       [194615,  49295, 302291,  74385, 269246],
       [242441, 125472, 309272, 480234, 391093],
       [   974,   1155,    839,    368,    203]], dtype=int32)
In [27]:
import random
j = 1
X_test_orig = np.int32(np.loadtxt('X_test.txt')).reshape((147468, 191))
X_test_perm = X_test_orig.copy()  # copy so the original test matrix is not modified
X_test_perm[:,j] = random.sample(list(X_test_orig[:,j]), len(list(X_test_orig[:,j])))
X_test_perm[0:5, 0:5]
Out[27]:
array([[303077, 406056, 177747,  44335,  69215],
       [364595, 632588, 193494,  71357,  11831],
       [194615,   1034, 302291,  74385, 269246],
       [242441, 447249, 309272, 480234, 391093],
       [   974, 146994,    839,    368,    203]], dtype=int32)
In [28]:
scores = model.evaluate(X_test_perm, y_test, verbose = 0)
print("Accuracy: %.2f%%" % (scores[1]*100))
Accuracy: 99.45%
In [29]:
perm_scores = []
for i in range(X_test.shape[1]):
    print(i)
    X_test_orig = np.int32(np.loadtxt('X_test.txt')).reshape((147468, 191))
    X_test_perm = X_test_orig.copy()  # copy so each iteration permutes a fresh matrix
    X_test_perm[:,i] = random.sample(list(X_test[:,i]), len(list(X_test[:,i])))
    perm_scores.append(abs(model.evaluate(X_test_perm, y_test, verbose = 0)[1]*100 - 99.46))
    print(perm_scores)
(cumulative per-iteration printouts collapsed; final perm_scores list:)
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076]
56
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756]
57
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626]
58
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716]
59
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328]
60
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107]
61
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629]
62
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719]
63
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951]
64
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165]
65
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357]
66
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394]
67
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787]
68
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787]
69
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349]
70
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357]
71
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787]
72
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357]
73
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933]
74
[... verbose cell output truncated: the cell repeatedly printed the same growing list of values together with its running length (75, 76, ..., 93 elements), each iteration appending one value; only the final list differs, and it is cut off in this export ...]
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069]
94
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069]
95
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397]
96
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304]
97
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397]
98
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304]
99
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304]
100
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357]
101
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716]
102
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629]
103
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026]
104
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909]
105
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397]
106
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328]
122
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719]
123
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301]
124
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072]
125
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505]
126
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107]
127
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258]
128
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072]
129
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444]
130
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767]
131
...(repetitive per-iteration output truncated: each step re-prints the growing list of probabilities together with its current length, 132-143 shown here)...
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258]
144
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487]
145
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787]
146
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077]
147
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444]
148
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444, 0.004290557951563301]
149
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444, 0.004290557951563301, 0.003168687444045304]
150
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444, 0.004290557951563301, 0.003168687444045304, 0.006324897604912394]
151
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
... (verbose output truncated: each iteration prints the current sequence length, incrementing from 152 to 162, followed by the full probability vector with one newly predicted value appended per step; the repeated vectors are omitted here for readability) ...
163
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444, 0.004290557951563301, 0.003168687444045304, 0.006324897604912394, 0.007003010822685951, 0.00022187864486511444, 0.007681124040473719, 0.0011343477906962107, 0.007003010822685951, 0.004968671169351069, 0.0036124447337755328, 0.0015781050804406505, 0.003846800661833072, 0.006559253532969933, 0.004968671169351069, 0.00022187864486511444, 0.004968671169351069]
164
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444, 0.004290557951563301, 0.003168687444045304, 0.006324897604912394, 0.007003010822685951, 0.00022187864486511444, 0.007681124040473719, 0.0011343477906962107, 0.007003010822685951, 0.004968671169351069, 0.0036124447337755328, 0.0015781050804406505, 0.003846800661833072, 0.006559253532969933, 0.004968671169351069, 0.00022187864486511444, 0.004968671169351069, 0.007681124040473719]
165
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444, 0.004290557951563301, 0.003168687444045304, 0.006324897604912394, 0.007003010822685951, 0.00022187864486511444, 0.007681124040473719, 0.0011343477906962107, 0.007003010822685951, 0.004968671169351069, 0.0036124447337755328, 0.0015781050804406505, 0.003846800661833072, 0.006559253532969933, 0.004968671169351069, 0.00022187864486511444, 0.004968671169351069, 0.007681124040473719, 0.006324897604912394]
166
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444, 0.004290557951563301, 0.003168687444045304, 0.006324897604912394, 0.007003010822685951, 0.00022187864486511444, 0.007681124040473719, 0.0011343477906962107, 0.007003010822685951, 0.004968671169351069, 0.0036124447337755328, 0.0015781050804406505, 0.003846800661833072, 0.006559253532969933, 0.004968671169351069, 0.00022187864486511444, 0.004968671169351069, 0.007681124040473719, 0.006324897604912394, 0.00045623457292265357]
167
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444, 0.004290557951563301, 0.003168687444045304, 0.006324897604912394, 0.007003010822685951, 0.00022187864486511444, 0.007681124040473719, 0.0011343477906962107, 0.007003010822685951, 0.004968671169351069, 0.0036124447337755328, 0.0015781050804406505, 0.003846800661833072, 0.006559253532969933, 0.004968671169351069, 0.00022187864486511444, 0.004968671169351069, 0.007681124040473719, 0.006324897604912394, 0.00045623457292265357, 0.0022562182982142076]
168
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444, 0.004290557951563301, 0.003168687444045304, 0.006324897604912394, 0.007003010822685951, 0.00022187864486511444, 0.007681124040473719, 0.0011343477906962107, 0.007003010822685951, 0.004968671169351069, 0.0036124447337755328, 0.0015781050804406505, 0.003846800661833072, 0.006559253532969933, 0.004968671169351069, 0.00022187864486511444, 0.004968671169351069, 0.007681124040473719, 0.006324897604912394, 0.00045623457292265357, 0.0022562182982142076, 0.005203027097394397]
169
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444, 0.004290557951563301, 0.003168687444045304, 0.006324897604912394, 0.007003010822685951, 0.00022187864486511444, 0.007681124040473719, 0.0011343477906962107, 0.007003010822685951, 0.004968671169351069, 0.0036124447337755328, 0.0015781050804406505, 0.003846800661833072, 0.006559253532969933, 0.004968671169351069, 0.00022187864486511444, 0.004968671169351069, 0.007681124040473719, 0.006324897604912394, 0.00045623457292265357, 0.0022562182982142076, 0.005203027097394397, 0.006559253532969933]
170
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444, 0.004290557951563301, 0.003168687444045304, 0.006324897604912394, 0.007003010822685951, 0.00022187864486511444, 0.007681124040473719, 0.0011343477906962107, 0.007003010822685951, 0.004968671169351069, 0.0036124447337755328, 0.0015781050804406505, 0.003846800661833072, 0.006559253532969933, 0.004968671169351069, 0.00022187864486511444, 0.004968671169351069, 0.007681124040473719, 0.006324897604912394, 0.00045623457292265357, 0.0022562182982142076, 0.005203027097394397, 0.006559253532969933, 0.0036124447337755328]
171
...
188
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444, 0.004290557951563301, 0.003168687444045304, 0.006324897604912394, 0.007003010822685951, 0.00022187864486511444, 0.007681124040473719, 0.0011343477906962107, 0.007003010822685951, 0.004968671169351069, 0.0036124447337755328, 0.0015781050804406505, 0.003846800661833072, 0.006559253532969933, 0.004968671169351069, 0.00022187864486511444, 0.004968671169351069, 0.007681124040473719, 0.006324897604912394, 0.00045623457292265357, 0.0022562182982142076, 0.005203027097394397, 0.006559253532969933, 0.0036124447337755328, 0.006324897604912394, 0.004524913879606629, 0.005881140315182165, 0.0018124610084839787, 0.009271706404092583, 0.005203027097394397, 
0.0015781050804406505, 0.009715463693822812, 0.0022562182982142076, 0.0008999918626386716, 0.0022562182982142076, 0.010627932839653909, 0.003846800661833072, 0.00045623457292265357, 0.008593593186319026, 0.004524913879606629, 0.00045623457292265357, 0.00045623457292265357]
189
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444, 0.004290557951563301, 0.003168687444045304, 0.006324897604912394, 0.007003010822685951, 0.00022187864486511444, 0.007681124040473719, 0.0011343477906962107, 0.007003010822685951, 0.004968671169351069, 0.0036124447337755328, 0.0015781050804406505, 0.003846800661833072, 0.006559253532969933, 0.004968671169351069, 0.00022187864486511444, 0.004968671169351069, 0.007681124040473719, 0.006324897604912394, 0.00045623457292265357, 0.0022562182982142076, 0.005203027097394397, 0.006559253532969933, 0.0036124447337755328, 0.006324897604912394, 0.004524913879606629, 0.005881140315182165, 0.0018124610084839787, 0.009271706404092583, 0.005203027097394397, 
0.0015781050804406505, 0.009715463693822812, 0.0022562182982142076, 0.0008999918626386716, 0.0022562182982142076, 0.010627932839653909, 0.003846800661833072, 0.00045623457292265357, 0.008593593186319026, 0.004524913879606629, 0.00045623457292265357, 0.00045623457292265357, 0.00045623457292265357]
190
[0.014018498928564327, 0.006559253532969933, 0.0008999918626386716, 0.003168687444045304, 0.01039357691161058, 0.0011343477906962107, 0.003846800661833072, 0.01310602978273323, 0.00045623457292265357, 0.0018124610084839787, 0.0024905742262717467, 0.0024905742262717467, 0.004524913879606629, 0.003846800661833072, 0.0008999918626386716, 0.005203027097394397, 0.005203027097394397, 0.0029343315160019756, 0.010627932839653909, 0.00022187864486511444, 0.0024905742262717467, 0.00045623457292265357, 0.003168687444045304, 0.00723736675074349, 0.0015781050804406505, 0.0018124610084839787, 0.005646784387124626, 0.00045623457292265357, 0.0015781050804406505, 0.0024905742262717467, 0.009271706404092583, 0.003846800661833072, 0.0011343477906962107, 0.005881140315182165, 0.011984159275229445, 0.006559253532969933, 0.005881140315182165, 0.007915479968531258, 0.0018124610084839787, 0.00723736675074349, 0.006324897604912394, 0.004524913879606629, 0.003846800661833072, 0.003168687444045304, 0.0008999918626386716, 0.009715463693822812, 0.005881140315182165, 0.004968671169351069, 0.0008999918626386716, 0.00022187864486511444, 0.0008999918626386716, 0.0029343315160019756, 0.01039357691161058, 0.009037350476035044, 0.00045623457292265357, 0.0022562182982142076, 0.0029343315160019756, 0.005646784387124626, 0.0008999918626386716, 0.0036124447337755328, 0.0011343477906962107, 0.004524913879606629, 0.007681124040473719, 0.007003010822685951, 0.005881140315182165, 0.00045623457292265357, 0.006324897604912394, 0.0018124610084839787, 0.0018124610084839787, 0.00723736675074349, 0.00045623457292265357, 0.0018124610084839787, 0.00045623457292265357, 0.006559253532969933, 0.0018124610084839787, 0.01334038571079077, 0.005203027097394397, 0.0011343477906962107, 0.003168687444045304, 0.0008999918626386716, 0.0011343477906962107, 0.010627932839653909, 0.0018124610084839787, 0.0029343315160019756, 0.00045623457292265357, 0.004524913879606629, 0.0022562182982142076, 0.0022562182982142076, 
0.003168687444045304, 0.00045623457292265357, 0.0024905742262717467, 0.003846800661833072, 0.009715463693822812, 0.004968671169351069, 0.004968671169351069, 0.005203027097394397, 0.003168687444045304, 0.005203027097394397, 0.003168687444045304, 0.003168687444045304, 0.00045623457292265357, 0.0008999918626386716, 0.004524913879606629, 0.008593593186319026, 0.010627932839653909, 0.005203027097394397, 0.008593593186319026, 0.007915479968531258, 0.003846800661833072, 0.007003010822685951, 0.0008999918626386716, 0.0029343315160019756, 0.007681124040473719, 0.005203027097394397, 0.005646784387124626, 0.0015781050804406505, 0.006559253532969933, 0.00045623457292265357, 0.007915479968531258, 0.004968671169351069, 0.0022562182982142076, 0.0036124447337755328, 0.007681124040473719, 0.004290557951563301, 0.003846800661833072, 0.0015781050804406505, 0.0011343477906962107, 0.007915479968531258, 0.003846800661833072, 0.00022187864486511444, 0.014462256218308767, 0.0008999918626386716, 0.004290557951563301, 0.004290557951563301, 0.0024905742262717467, 0.0011343477906962107, 0.005203027097394397, 0.009271706404092583, 0.0015781050804406505, 0.003168687444045304, 0.005881140315182165, 0.003168687444045304, 0.00022187864486511444, 0.007915479968531258, 0.008359237258261487, 0.0018124610084839787, 0.01334038571079077, 0.00022187864486511444, 0.004290557951563301, 0.003168687444045304, 0.006324897604912394, 0.007003010822685951, 0.00022187864486511444, 0.007681124040473719, 0.0011343477906962107, 0.007003010822685951, 0.004968671169351069, 0.0036124447337755328, 0.0015781050804406505, 0.003846800661833072, 0.006559253532969933, 0.004968671169351069, 0.00022187864486511444, 0.004968671169351069, 0.007681124040473719, 0.006324897604912394, 0.00045623457292265357, 0.0022562182982142076, 0.005203027097394397, 0.006559253532969933, 0.0036124447337755328, 0.006324897604912394, 0.004524913879606629, 0.005881140315182165, 0.0018124610084839787, 0.009271706404092583, 0.005203027097394397, 
0.0015781050804406505, 0.009715463693822812, 0.0022562182982142076, 0.0008999918626386716, 0.0022562182982142076, 0.010627932839653909, 0.003846800661833072, 0.00045623457292265357, 0.008593593186319026, 0.004524913879606629, 0.00045623457292265357, 0.00045623457292265357, 0.00045623457292265357, 0.009037350476035044]
In [32]:
import numpy as np
import matplotlib.pyplot as plt
plt.figure(figsize=[16,5])
barlist = plt.bar(np.arange(len(perm_scores)), perm_scores)
plt.xlabel('Bases')
plt.ylabel('Magnitude of saliency values')
#plt.xticks(np.arange(len(sal)), list(sequences[sequence_index]));
plt.title('Saliency map for bases in one of the ancient sequences')
plt.show()
In [55]:
import pandas as pd
scores_df = pd.DataFrame({'Base': range(len(perm_scores)),'Score': perm_scores})
scores_df.to_csv('scores_df.txt', index = False, sep = '\t')
scores_df[scores_df['Score'] > 0.01]
Out[55]:
Base Score
0 0 0.014018
4 4 0.010394
7 7 0.013106
18 18 0.010628
34 34 0.011984
52 52 0.010394
75 75 0.013340
81 81 0.010628
104 104 0.010628
130 130 0.014462
146 146 0.013340
182 182 0.010628
In [58]:
informative_indeces = list(scores_df[scores_df['Score'] > 0.01].index)
informative_indeces
Out[58]:
[0, 4, 7, 18, 34, 52, 75, 81, 104, 130, 146, 182]
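The threshold filter above is plain pandas boolean indexing; it behaves the same on any small frame (toy scores below are made up for illustration):

```python
import pandas as pd

# Toy per-base scores mimicking the saliency values above
toy = pd.DataFrame({'Base': range(5), 'Score': [0.002, 0.014, 0.001, 0.011, 0.003]})

# The boolean mask keeps rows above the cutoff; .index gives their positions
informative = list(toy[toy['Score'] > 0.01].index)
print(informative)  # [1, 3]
```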
In [59]:
X_test = np.int32(np.loadtxt('X_test.txt')).reshape((147468, 191))
X_test[0:5,0:5]
Out[59]:
array([[303077,  93490, 177747,  44335,  69215],
       [364595, 217876, 193494,  71357,  11831],
       [194615,  49295, 302291,  74385, 269246],
       [242441, 125472, 309272, 480234, 391093],
       [   974,   1155,    839,    368,    203]], dtype=int32)
In [60]:
X_test.shape
Out[60]:
(147468, 191)
In [61]:
X_test[informative_indeces, 0:5]
Out[61]:
array([[303077,  93490, 177747,  44335,  69215],
       [   974,   1155,    839,    368,    203],
       [542875,  54340, 196763, 170736, 311169],
       [469459, 394526, 598262, 486990, 457686],
       [360562, 383467, 217220, 297440, 105521],
       [755854, 654648, 604365, 572152, 648361],
       [775876, 599853, 587304, 620671, 580357],
       [191451, 376984,  68107, 174809,  95887],
       [   298,    318,     85,    214,    517],
       [ 54917, 139078, 240735,  72132,  88915],
       [ 54880,  18372,   8278, 143712,  66992],
       [152302,  37532,  39964,  84685,  91864]], dtype=int32)
In [62]:
reverse_word_map = dict(map(reversed, tokenizer.word_index.items()))
def sequence_to_text(list_of_indices):
    return [reverse_word_map.get(letter) for letter in list_of_indices]
my_texts = list(map(sequence_to_text, X_test[informative_indeces, :]))
my_texts[0][0:5]
Out[62]:
['tagatgctaa', 'agatgctaaa', 'gatgctaaaa', 'atgctaaaat', 'tgctaaaatt']
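The reverse-mapping trick above is not Keras-specific: it inverts any word_index-style dict (word to integer, as Tokenizer builds it). A self-contained sketch with made-up k-mers:

```python
# Toy word_index as Keras' Tokenizer would build it: word -> integer, starting at 1
word_index = {'tagatgctaa': 1, 'agatgctaaa': 2, 'gatgctaaaa': 3}

# Invert it so integer codes can be mapped back to k-mers
reverse_word_map = dict(map(reversed, word_index.items()))

def sequence_to_text(list_of_indices):
    return [reverse_word_map.get(i) for i in list_of_indices]

print(sequence_to_text([2, 1]))  # ['agatgctaaa', 'tagatgctaa']
```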
In [63]:
len([item for sublist in my_texts for item in sublist])
Out[63]:
2292
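The flatten-and-count idiom used in the next cell can be checked on a toy nested list:

```python
from collections import Counter

# Two toy "texts", each a list of k-mers
texts = [['aaaaa', 'ttttt'], ['aaaaa', 'ggggg']]

# Flatten the list of lists, upper-case each k-mer, count occurrences
flat = [item.upper() for sublist in texts for item in sublist]
counts = Counter(flat).most_common(2)
print(counts[0])  # ('AAAAA', 2)
```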
In [65]:
from collections import Counter
import matplotlib.pyplot as plt
fig = plt.figure(figsize = (20, 15))

D = dict(Counter([item.upper() for sublist in my_texts for item in sublist]).most_common(20))
plt.bar(range(len(D)), list(D.values()), align = 'center')
plt.title('Most Predictive K-mers', fontsize = 20)
plt.ylabel("Counts", fontsize = 20)
plt.xticks(rotation = 90)
plt.xticks(range(len(D)), list(D.keys()), fontsize = 20)
plt.show()
In [4]:
print('Building Neanderthal introgressed sequences')
intr_sentences = []
for i in range(len(intr_seqs)):
    intr_sentences.append(getKmers(intr_seqs[i], kmer))

print('Building Neanderthal depleted sequences')
depl_sentences = []
for i in range(len(depl_seqs)):
    depl_sentences.append(getKmers(depl_seqs[i], kmer))
Building Neanderthal introgressed sequences
Building Neanderthal depleted sequences
In [7]:
from collections import Counter
import matplotlib.pyplot as plt
fig = plt.figure(figsize = (20, 18))
fig.subplots_adjust(hspace = 0.6, wspace = 0.6)

plt.subplot(2, 1, 1)
D = dict(Counter([item for sublist in intr_sentences for item in sublist]).most_common(20))
plt.bar(range(len(D)), list(D.values()), align='center')
plt.title('Most Common K-mers for Neanderthal Introgressed Regions', fontsize = 20)
plt.ylabel("Counts", fontsize = 20)
plt.xticks(rotation = 90)
plt.xticks(range(len(D)), list(D.keys()), fontsize = 20)

plt.subplot(2, 1, 2)
D = dict(Counter([item for sublist in depl_sentences for item in sublist]).most_common(20))
plt.bar(range(len(D)), list(D.values()), align='center')
plt.title('Most Common K-mers for Neanderthal Depleted Regions', fontsize = 20)
plt.ylabel("Counts", fontsize = 20)
plt.xticks(rotation = 90)
plt.xticks(range(len(D)), list(D.keys()), fontsize = 20)

plt.show()

We can see that all feature-importance methods point at AT-rich k-mers (especially A-repeats) as most predictive for classifying Neanderthal introgressed vs. depleted regions. Now we will use the trained and saved LSTM model to generate predictions for human genes, similarly to what we did with Random Forest. We start by reading the gene sequences and pre-processing them with the Tokenizer in the same way the Neanderthal introgressed and depleted regions were prepared.

In [2]:
from Bio import SeqIO

gene_file = 'hg19_gene_clean.fa'

e = 0
gene_seqs = []
gene_ids = []
for gene in SeqIO.parse('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/Genes/' + gene_file, 
                        'fasta'):
    #step = 200; jump = 1; a = 0; b = step; n_jumps = 5
    
    cutoff = 200
    if len(str(gene.seq)) < cutoff:
        continue
    
    #if len(str(gene.seq)) < (step + 1) * n_jumps:
    #    continue
    
    gene_ids.append(str(gene.id))
    s_gene = str(gene.seq)[0:cutoff]
    gene_seqs.append(s_gene)
    
    #for j in range(n_jumps):
    #    s_gene = str(gene.seq)[a:b]
    #    gene_seqs.append(s_gene)
    #    a = a + jump
    #    b = a + step
    
    e = e + 1
    if e%10000 == 0:
        print('Finished ' + str(e) + ' genes')

def getKmers(sequence, size):
    return [sequence[x:x+size].upper() for x in range(len(sequence) - size + 1)]

kmer = 10
gene_texts = [' '.join(getKmers(i, kmer)) for i in gene_seqs]
Finished 10000 genes
Finished 20000 genes
Finished 30000 genes
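As a quick sanity check, getKmers simply slides a window of length size across the sequence and upper-cases each window, so a sequence of length L yields L - size + 1 overlapping k-mers:

```python
def getKmers(sequence, size):
    return [sequence[x:x+size].upper() for x in range(len(sequence) - size + 1)]

# A 6-base sequence yields 6 - 3 + 1 = 4 overlapping 3-mers
print(getKmers('atgcgt', 3))  # ['ATG', 'TGC', 'GCG', 'CGT']
```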
In [4]:
from keras.preprocessing.text import Tokenizer
from keras.preprocessing.sequence import pad_sequences
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_extraction.text import TfidfTransformer

import warnings
warnings.filterwarnings('ignore')

tokenizer = Tokenizer()
tokenizer.fit_on_texts(gene_texts)

encoded_docs = tokenizer.texts_to_sequences(gene_texts)
max_length = max([len(s.split()) for s in gene_texts])
X_gene = pad_sequences(encoded_docs, maxlen = max_length, padding = 'post')

print(X_gene)
print('\n')
print(X_gene.shape)
[[214889  52844  52845 ... 134126 182650 300147]
 [214902  52854  59690 ...  12510   9570  52859]
 [214902  52854  59690 ...  12510   9570  52859]
 ...
 [728028 728029 330655 ...  91439 189505 222366]
 [  2363   3178   5022 ...    401    318    346]
 [214308 130470  41954 ... 156236  96659 268170]]


(31304, 191)
In [5]:
vocab_size = len(tokenizer.word_index) + 1
print(vocab_size)
878635
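The +1 reflects the Keras convention that Tokenizer assigns word indices starting at 1, with 0 reserved for the padding value that pad_sequences inserts; the Embedding layer must therefore cover indices 0 through len(word_index). A toy illustration of the convention without Keras:

```python
# Mimic Tokenizer.word_index: indices start at 1 (0 is reserved for padding)
words = ['aaaaacgtgt', 'aaacgtgtgt', 'acgtgtgtgt']
word_index = {w: i + 1 for i, w in enumerate(words)}

# The Embedding input_dim must cover indices 0..len(word_index),
# hence vocab_size = len(word_index) + 1
vocab_size = len(word_index) + 1
print(vocab_size)  # 4
```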
In [7]:
import os
from keras.models import Sequential
from keras.callbacks import ModelCheckpoint
from keras.optimizers import SGD, Adam, Adadelta, RMSprop
from keras.layers import Conv1D, Dense, MaxPooling1D, Flatten, Dropout
from keras.layers import Embedding, GlobalAveragePooling1D, LSTM, SimpleRNN, GRU, Bidirectional

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

model = Sequential()
model.add(Embedding(964114, 10))  # input_dim must match the saved model's vocabulary (964114), not the current tokenizer's
model.add(Bidirectional(LSTM(10)))
model.add(Dense(10, activation = 'relu'))
model.add(Dense(1, activation = 'sigmoid'))

model.load_weights("LSTM.weights.best.hdf5")
model.compile(loss = 'binary_crossentropy', optimizer = 'rmsprop', metrics = ['accuracy'])
model.summary()
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding_2 (Embedding)      (None, None, 10)          9641140   
_________________________________________________________________
bidirectional_2 (Bidirection (None, 20)                1680      
_________________________________________________________________
dense_3 (Dense)              (None, 10)                210       
_________________________________________________________________
dense_4 (Dense)              (None, 1)                 11        
=================================================================
Total params: 9,643,041
Trainable params: 9,643,041
Non-trainable params: 0
_________________________________________________________________
In [15]:
gene_predictions = model.predict_classes(X_gene)
gene_predictions_prob = model.predict_proba(X_gene)
In [16]:
X_gene.shape
Out[16]:
(31304, 191)
In [20]:
gene_predictions.shape
Out[20]:
(31304, 1)
In [21]:
gene_predictions_prob.shape
Out[21]:
(31304, 1)
In [19]:
import numpy as np
print(np.sum(gene_predictions == 0))
print(np.sum(gene_predictions == 1))
22143
9161
In [46]:
np.sum(gene_predictions_prob>0.99)
Out[46]:
4905
In [25]:
import os
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')
with open('gene_ids_LSTM.txt', 'w') as f:
    for item in gene_ids:
        f.write("%s\n" % item)
In [26]:
import os
os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')
gene_ids = []
gene_symbol = []
with open('gene_ids_LSTM.txt','r') as fin:
    for line in fin:
        line = line.split('\t')
        gene_ids.append(line[0])
        gene_symbol.append(line[1].rstrip())
In [54]:
import pandas as pd
gene_pred_df = pd.DataFrame({'Gene': gene_ids, 'Gene_Symbol': gene_symbol, 
                             'Predict': list(gene_predictions.flatten()), 
                             'Prob': list(gene_predictions_prob.flatten())})
gene_pred_df = gene_pred_df.sort_values(['Prob'], ascending = False)
gene_pred_df[(gene_pred_df['Predict'] == 1) & (gene_pred_df['Prob'] > 0.8)]
Out[54]:
Gene Gene_Symbol Predict Prob
7360 chr12:120884241-120901556 GATC 1 1.000000
23845 chr6:73844526-73853237 KCNQ5-AS1 1 1.000000
20238 chr4:40194587-40246384 RHOH 1 1.000000
2022 chr1:174769035-174964445 RABGAP1L 1 1.000000
2024 chr1:174904084-174923398 LOC101928696 1 1.000000
8518 chr14:75230069-75304013 YLPM1 1 1.000000
2068 chr1:180199433-180244188 LHX4 1 1.000000
8408 chr14:61176256-61190852 SIX4 1 1.000000
2107 chr1:186369704-186370587 OCLM 1 1.000000
2114 chr1:187412760-187446354 LINC01037 1 1.000000
8219 chr14:24641234-24649463 REC8 1 1.000000
8198 chr14:24422944-24438488 DHRS4 1 1.000000
8109 chr14:20779527-20801471 CCNB1IP1 1 1.000000
2204 chr1:202955580-202976393 LOC100506747 1 1.000000
8033 chr13:111766159-111768025 ARHGEF7-AS2 1 1.000000
7926 chr13:85937738-86118797 LINC00351 1 1.000000
26185 chr7:33019086-33046543 FKBP9 1 1.000000
7701 chr13:39106137-39260812 LINC00437 1 1.000000
2418 chr1:226335704-226342678 ACBD3-AS1 1 1.000000
7465 chr12:125396191-125399587 UBC 1 1.000000
29638 chrUn_gl000211:48503-93165 FLJ43315 1 1.000000
18818 chr3:99979661-100044096 TBC1D23 1 1.000000
17116 chr20:45947246-45949498 LOC100131496 1 1.000000
8943 chr15:45406523-45410301 DUOXA2 1 1.000000
8994 chr15:55647421-55700708 CCPG1 1 1.000000
31193 chrX:154718673-154842622 TMLHE 1 1.000000
1998 chr1:171308035-171310463 TOP1P1 1 1.000000
11237 chr17:39845127-39847898 EIF1 1 1.000000
11141 chr17:37884753-37886816 MIEN1 1 1.000000
23973 chr6:97537843-97862283 MIR548H3 1 1.000000
... ... ... ... ...
3298 chr10:74653314-74692794 OIT3 1 0.805628
9439 chr16:1543352-1560460 TELO2 1 0.805495
3923 chr10:127660757-127661842 FANK1-AS1 1 0.805156
12214 chr18:13825543-13826861 MC5R 1 0.805062
19555 chr3:178960554-178977679 KCNMB3 1 0.804649
19553 chr3:178960554-178969403 KCNMB3 1 0.804649
19556 chr3:178960554-178984838 KCNMB3 1 0.804649
19554 chr3:178960554-178969645 KCNMB3 1 0.804649
31106 chrX:153167985-153172620 AVPR2 1 0.804173
28043 chr8:67996083-68108849 CSPP1 1 0.803427
6561 chr12:52445186-52453291 NR4A1 1 0.803390
9967 chr16:48278078-48396910 LONP2 1 0.803327
9966 chr16:48278078-48387890 LONP2 1 0.803327
1302 chr1:112141629-112150940 LINC01160 1 0.803124
5193 chr11:71249071-71250253 KRTAP5-8 1 0.802997
14062 chr19:54754269-54761171 LILRB5 1 0.802977
29724 chrX:2746863-2800861 GYG2 1 0.802839
481 chr1:26348271-26362954 EXTL1 1 0.802744
17280 chr20:61038553-61051026 GATA5 1 0.802477
23727 chr6:49753364-49755053 PGK2 1 0.802121
14232 chr19:58637695-58662148 ZNF329 1 0.802068
11540 chr17:48585745-48608862 MYCBPAP 1 0.801869
11563 chr17:49414076-49419932 LINC02072 1 0.801745
8746 chr14:105641825-105647660 NUDT14 1 0.801622
5715 chr11:120811264-120828748 LOC101929208 1 0.801326
13131 chr19:17634110-17662835 FAM129C 1 0.800995
13132 chr19:17634110-17664648 FAM129C 1 0.800995
13117 chr19:17402940-17414282 ABHD8 1 0.800772
8872 chr15:40380091-40401085 BMF 1 0.800453
11126 chr17:37333389-37353956 CACNB1 1 0.800239

7806 rows × 4 columns

In [58]:
gene_pred_df[(gene_pred_df['Predict'] == 0) & (gene_pred_df['Prob'] > 0.4)]
Out[58]:
Gene Gene_Symbol Predict Prob
3545 chr10:96796529-96829254 CYP2C8 0 0.499991
17966 chr3:10353261-10362872 SEC13 0 0.499689
30048 chrX:48555124-48567406 SUV39H1 0 0.499661
28549 chr8:144329098-144344875 ZFP41 0 0.499642
23474 chr6:35704809-35716690 ARMC12 0 0.499311
28315 chr8:104384661-104395232 CTHRC1 0 0.499196
14516 chr2:27309611-27323619 KHK 0 0.499085
10749 chr17:15138536-15168690 PMP22 0 0.498806
5699 chr11:119531703-119599435 NECTIN1 0 0.498654
23052 chr6:26251879-26252303 HIST1H2BH 0 0.498464
27321 chr7:150712239-150721586 ATG9B 0 0.497740
26769 chr7:99775366-99812010 STAG3 0 0.497024
11333 chr17:41878167-41910562 MPP3 0 0.497001
11332 chr17:41878167-41910547 MPP3 0 0.497001
26100 chr7:27179983-27195547 HOXA-AS3 0 0.495020
21490 chr5:43121563-43176426 ZNF131 0 0.494749
22092 chr5:131966281-131977482 TH2LCRR 0 0.494303
2118 chr1:192127592-192154945 RGS18 0 0.494172
5207 chr11:71709958-71713850 IL18BP 0 0.493687
5206 chr11:71709958-71713574 IL18BP 0 0.493687
2138 chr1:196946667-196978803 CFHR5 0 0.493363
6493 chr12:50135293-50158717 TMBIM6 0 0.493251
5957 chr12:4829752-4881892 GALNT8 0 0.493137
26454 chr7:72440165-72443674 LOC541473 0 0.492875
22342 chr5:140855569-140892544 PCDHGC3 0 0.492552
22341 chr5:140855569-140858362 PCDHGC3 0 0.492552
18735 chr3:73672719-73677050 PDZRN3-AS1 0 0.492472
7934 chr13:90712501-90771971 LINC00559 0 0.492298
15090 chr2:86441120-86565206 REEP1 0 0.491801
15089 chr2:86441120-86564777 REEP1 0 0.491801
... ... ... ... ...
4875 chr11:62283374-62314332 AHNAK 0 0.406356
1136 chr1:93775666-93811368 CCDC18-AS1 0 0.406296
28094 chr8:75736772-75767279 PI15 0 0.406261
14208 chr19:58180303-58190520 ZSCAN4 0 0.406165
10634 chr17:7476024-7482324 EIF4A1 0 0.406023
14802 chr2:55459814-55462989 RPS27A 0 0.405657
18122 chr3:26666158-26752265 LRRC3B 0 0.405512
1964 chr1:168369427-168391894 LOC100505918 0 0.405417
8064 chr13:113863008-113919392 CUL4A 0 0.405372
9787 chr16:28875314-28885534 SH2B1 0 0.405330
15499 chr2:131095506-131099956 CCDC115 0 0.404921
15498 chr2:131095506-131099681 CCDC115 0 0.404921
15500 chr2:131095506-131100254 CCDC115 0 0.404921
2922 chr10:30900708-30918647 LYZL2 0 0.404789
4728 chr11:56615954-56645554 LOC101927120 0 0.404415
1593 chr1:152730506-152734529 KPRP 0 0.404212
27564 chr8:11985367-11986806 LOC392196 0 0.403759
31275 chrY:24636544-24660784 PRY 0 0.403243
31276 chrY:24636544-24660784 PRY 0 0.403243
8576 chr14:88399358-88460009 GALC 0 0.403188
18556 chr3:52444524-52457657 PHF7 0 0.402771
1774 chr1:156691683-156698231 ISG20L2 0 0.402407
6709 chr12:55794313-55795251 OR6C65 0 0.401837
26626 chr7:87505544-87538856 DBF4 0 0.401706
551 chr1:31342313-31381480 SDC3 0 0.401242
15538 chr2:132741359-132795473 LINC01945 0 0.401008
7591 chr13:24553765-24881212 SPATA13 0 0.400613
7590 chr13:24553765-24609166 SPATA13 0 0.400613
13369 chr19:36347952-36358048 KIRREL2 0 0.400385
1051 chr1:84609931-84671122 PRKACB 0 0.400176

424 rows × 4 columns

In [59]:
gene_pred_df.to_csv('Neanderthal_Genes_LSTM.txt', index = False, sep = '\t')

Visualization of K-mer Word2Vec Embeddings

Here we will use Google's Word2Vec word-embedding algorithm to visualize the dictionary of the Neanderthal introgressed and depleted texts. We will again start by reading the two fasta-files and converting the sequences to sentences for later use with the Gensim Python library for Word2Vec:

In [1]:
import os
from Bio import SeqIO
from Bio.Seq import Seq

os.chdir('/home/nikolay/Documents/Medium/DeepLearningNeanderthalIntrogression/NeandIntr/')

intr_file = 'hg19_intr_clean.fa'
depl_file = 'hg19_depl_clean.fa'

e = 0
intr_seqs = []
depl_seqs = []
for intr, depl in zip(SeqIO.parse(intr_file, 'fasta'), SeqIO.parse(depl_file, 'fasta')):
    
    cutoff = 500
    my_intr_seq = str(intr.seq)[0:cutoff]
    my_depl_seq = str(depl.seq)[0:cutoff]
    
    intr_seqs.append(my_intr_seq)    
    depl_seqs.append(my_depl_seq)

    e = e + 1
    if e%10000 == 0:
        print('Finished ' + str(e) + ' entries')
Finished 10000 entries
Finished 20000 entries
Finished 30000 entries
Finished 40000 entries
Finished 50000 entries
Finished 60000 entries
Finished 70000 entries
In [2]:
sequences = intr_seqs + depl_seqs
len(sequences)
Out[2]:
147468
In [3]:
def getKmers(sequence, size):
    return [sequence[x:x+size].upper() for x in range(len(sequence) - size + 1)]
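As a quick sanity check, the sliding-window k-mer extraction behaves as follows (a minimal standalone example with a made-up 6-base sequence, not part of the notebook's pipeline):

```python
def getKmers(sequence, size):
    # Slide a window of length `size` one base at a time and uppercase each k-mer
    return [sequence[x:x+size].upper() for x in range(len(sequence) - size + 1)]

# A 6-base sequence yields 6 - 5 + 1 = 2 overlapping 5-mers
print(getKmers('atgcgt', 5))  # → ['ATGCG', 'TGCGT']
```

A sequence trimmed to the 500 bp cutoff above therefore yields 500 - 5 + 1 = 496 overlapping 5-mer "words" per "sentence".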
In [4]:
print('Building Neanderthal introgressed sequences')
intr_sentences = []
for i in range(len(intr_seqs)):
    intr_sentences.append(getKmers(intr_seqs[i], 5))

print('Building Neanderthal depleted sequences')
depl_sentences = []
for i in range(len(depl_seqs)):
    depl_sentences.append(getKmers(depl_seqs[i], 5))

print('Building merged Neanderthal introgressed and depleted sequences')
sentences = []
for i in range(len(sequences)):
    sentences.append(getKmers(sequences[i], 5))
Building Neanderthal introgressed sequences
Building Neanderthal depleted sequences
Building merged Neanderthal introgressed and depleted sequences
In [13]:
import warnings
warnings.filterwarnings('ignore')

from gensim.models import Word2Vec
model = Word2Vec(sentences, min_count = 2, workers = 4)
print(model)
Word2Vec(vocab=1024, size=100, alpha=0.025)
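The vocabulary size of 1024 is no coincidence: with k = 5 there are exactly 4^5 = 1024 possible k-mers over the DNA alphabet, so every 5-mer occurs often enough in the corpus to survive the min_count = 2 filter. A quick standalone check:

```python
from itertools import product

# Count all possible 5-letter words over the DNA alphabet {A, C, G, T}
n_possible_kmers = len([''.join(p) for p in product('ACGT', repeat=5)])
print(n_possible_kmers)  # → 1024, matching the Word2Vec vocabulary size
```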
In [14]:
X = model[model.wv.vocab]
X.shape
Out[14]:
(1024, 100)

Now each word is one observation with 100 coordinates, the default embedding dimensionality in Word2Vec. Next we can use the constructed word vectors to visualize the k-mer space with PCA, tSNE and UMAP. We will highlight the most predictive k-mers from the Random Forest classification in green.
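Beyond visualization, the 100-dimensional vectors let us compare k-mers directly via cosine similarity, which is what Gensim's most_similar computes under the hood. A minimal numpy sketch on toy vectors (the k-mer names and values here are illustrative stand-ins, not taken from the trained model):

```python
import numpy as np

def cosine_similarity(u, v):
    # Cosine of the angle between two embedding vectors
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

rng = np.random.default_rng(0)
base = rng.normal(size=100)
vec_aaaaa = base + 0.1 * rng.normal(size=100)  # toy vector for 'AAAAA'
vec_caaaa = base + 0.1 * rng.normal(size=100)  # toy vector for 'CAAAA' (similar contexts)
vec_gcgcg = rng.normal(size=100)               # toy vector for 'GCGCG' (unrelated)

print(cosine_similarity(vec_aaaaa, vec_caaaa))  # close to 1: k-mers from similar contexts
print(cosine_similarity(vec_aaaaa, vec_gcgcg))  # close to 0: unrelated k-mers
```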

In [16]:
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA
X = model[model.wv.vocab]
pca = PCA(n_components = 2)
result = pca.fit_transform(X)

plt.figure(figsize = (20,18))
plt.scatter(result[:, 0], result[:, 1], s = 10, cmap = 'tab10')
plt.title('Principal Components Analysis (PCA): All K-mers', fontsize = 20)
plt.xlabel("PC1", fontsize = 20)
plt.ylabel("PC2", fontsize = 20)
words = list(model.wv.vocab)
top_kmers = {'AAAAA', 'CAAAA', 'CATTT', 'TTTTT'}  # most predictive k-mers from the Random Forest
for i, word in enumerate(words):
    if word in top_kmers:
        plt.text(result[i, 0], result[i, 1], word, fontsize = 30, c = 'green')
    else:
        plt.text(result[i, 0], result[i, 1], word, fontsize = 10, c = 'red')
plt.show()
In [49]:
from umap import UMAP
import matplotlib.pyplot as plt
X = model[model.wv.vocab]
X_reduced = PCA(n_components = 5).fit_transform(X)
umap_model = UMAP(n_neighbors = 30, min_dist = 0.2, n_components = 2)
umap = umap_model.fit_transform(X_reduced)
plt.figure(figsize=(20,15))
plt.scatter(umap[:, 0], umap[:, 1], s = 10, cmap = 'tab10')
plt.title('UMAP: All K-mers', fontsize = 20)
plt.xlabel("UMAP1", fontsize = 20)
plt.ylabel("UMAP2", fontsize = 20)
words = list(model.wv.vocab)
top_kmers = {'AAAAA', 'CAAAA', 'CATTT', 'TTTTT'}  # most predictive k-mers from the Random Forest
for i, word in enumerate(words):
    if word in top_kmers:
        plt.text(umap[i, 0], umap[i, 1], word, fontsize = 30, c = 'green')
    else:
        plt.text(umap[i, 0], umap[i, 1], word, fontsize = 10, c = 'red')
plt.show()
In [53]:
from sklearn.manifold import TSNE

plt.figure(figsize=(20, 15))
X_reduced = PCA(n_components = 5).fit_transform(X)
tsne_model = TSNE(learning_rate = 500, n_components = 2, random_state = 123, perplexity = 30)
tsne = tsne_model.fit_transform(X_reduced)
plt.scatter(tsne[:, 0], tsne[:, 1], cmap = 'tab10', s = 10)
plt.title('tSNE on PCA: All K-mers', fontsize = 20)
plt.xlabel("tSNE1", fontsize = 20)
plt.ylabel("tSNE2", fontsize = 20)
words = list(model.wv.vocab)
top_kmers = {'AAAAA', 'CAAAA', 'CATTT', 'TTTTT'}  # most predictive k-mers from the Random Forest
for i, word in enumerate(words):
    if word in top_kmers:
        plt.text(tsne[i, 0], tsne[i, 1], word, fontsize = 30, c = 'green')
    else:
        plt.text(tsne[i, 0], tsne[i, 1], word, fontsize = 10, c = 'red')
plt.show()

Now let us compare the Word2Vec K-mer embeddings for Neanderthal introgressed vs. depleted texts. We start with the introgressed regions:

In [20]:
import warnings
warnings.filterwarnings('ignore')

from gensim.models import Word2Vec
model_intr = Word2Vec(intr_sentences, min_count = 2, workers = 4)
print(model_intr)
Word2Vec(vocab=1024, size=100, alpha=0.025)
In [29]:
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA
X_intr = model_intr[model_intr.wv.vocab]
pca = PCA(n_components = 2)
result = pca.fit_transform(X_intr)

plt.figure(figsize = (20,18))
plt.scatter(result[:, 0], result[:, 1], s = 10, cmap = 'tab10')
plt.title('Principal Components Analysis (PCA): Neanderthal introgressed K-mers', fontsize = 20)
plt.xlabel("PC1", fontsize = 20)
plt.ylabel("PC2", fontsize = 20)
words = list(model_intr.wv.vocab)
top_kmers = {'AAAAA', 'CAAAA', 'CATTT', 'TTTTT'}  # most predictive k-mers from the Random Forest
for i, word in enumerate(words):
    if word in top_kmers:
        plt.text(result[i, 0], result[i, 1], word, fontsize = 30, c = 'green')
    else:
        plt.text(result[i, 0], result[i, 1], word, fontsize = 10, c = 'red')
plt.show()
In [30]:
from umap import UMAP
import matplotlib.pyplot as plt
X_intr = model_intr[model_intr.wv.vocab]
X_reduced = PCA(n_components = 5).fit_transform(X_intr)
umap_model = UMAP(n_neighbors = 30, min_dist = 0.1, n_components = 2)
umap = umap_model.fit_transform(X_reduced)
plt.figure(figsize=(20,15))
plt.scatter(umap[:, 0], umap[:, 1], s = 10, cmap = 'tab10')
plt.title('UMAP: Neanderthal introgressed K-mers', fontsize = 20)
plt.xlabel("UMAP1", fontsize = 20)
plt.ylabel("UMAP2", fontsize = 20)
words = list(model_intr.wv.vocab)
top_kmers = {'AAAAA', 'CAAAA', 'CATTT', 'TTTTT'}  # most predictive k-mers from the Random Forest
for i, word in enumerate(words):
    if word in top_kmers:
        plt.text(umap[i, 0], umap[i, 1], word, fontsize = 30, c = 'green')
    else:
        plt.text(umap[i, 0], umap[i, 1], word, fontsize = 10, c = 'red')
plt.show()
In [31]:
from sklearn.manifold import TSNE

plt.figure(figsize=(20, 15))
X_reduced = PCA(n_components = 5).fit_transform(X_intr)
tsne_model = TSNE(learning_rate = 500, n_components = 2, random_state = 123, perplexity = 30)
tsne = tsne_model.fit_transform(X_reduced)
plt.scatter(tsne[:, 0], tsne[:, 1], cmap = 'tab10', s = 10)
plt.title('tSNE on PCA: Neanderthal introgressed K-mers', fontsize = 20)
plt.xlabel("tSNE1", fontsize = 20)
plt.ylabel("tSNE2", fontsize = 20)
words = list(model_intr.wv.vocab)
top_kmers = {'AAAAA', 'CAAAA', 'CATTT', 'TTTTT'}  # most predictive k-mers from the Random Forest
for i, word in enumerate(words):
    if word in top_kmers:
        plt.text(tsne[i, 0], tsne[i, 1], word, fontsize = 30, c = 'green')
    else:
        plt.text(tsne[i, 0], tsne[i, 1], word, fontsize = 10, c = 'red')
plt.show()

And now we will visualize the word-embedding vocabulary for the Neanderthal depleted regions:

In [25]:
import warnings
warnings.filterwarnings('ignore')

from gensim.models import Word2Vec
model_depl = Word2Vec(depl_sentences, min_count = 2, workers = 4)
print(model_depl)
Word2Vec(vocab=1024, size=100, alpha=0.025)
In [32]:
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA
X_depl = model_depl[model_depl.wv.vocab]
pca = PCA(n_components = 2)
result = pca.fit_transform(X_depl)

plt.figure(figsize = (20,18))
plt.scatter(result[:, 0], result[:, 1], s = 10, cmap = 'tab10')
plt.title('Principal Components Analysis (PCA): Neanderthal depleted K-mers', fontsize = 20)
plt.xlabel("PC1", fontsize = 20)
plt.ylabel("PC2", fontsize = 20)
words = list(model_depl.wv.vocab)
top_kmers = {'AAAAA', 'CAAAA', 'CATTT', 'TTTTT'}  # most predictive k-mers from the Random Forest
for i, word in enumerate(words):
    if word in top_kmers:
        plt.text(result[i, 0], result[i, 1], word, fontsize = 30, c = 'green')
    else:
        plt.text(result[i, 0], result[i, 1], word, fontsize = 10, c = 'red')
plt.show()
In [33]:
from umap import UMAP
import matplotlib.pyplot as plt
X_depl = model_depl[model_depl.wv.vocab]
X_reduced = PCA(n_components = 5).fit_transform(X_depl)
umap_model = UMAP(n_neighbors = 30, min_dist = 0.1, n_components = 2)
umap = umap_model.fit_transform(X_reduced)
plt.figure(figsize=(20,15))
plt.scatter(umap[:, 0], umap[:, 1], s = 10, cmap = 'tab10')
plt.title('UMAP: Neanderthal depleted K-mers', fontsize = 20)
plt.xlabel("UMAP1", fontsize = 20)
plt.ylabel("UMAP2", fontsize = 20)
words = list(model_depl.wv.vocab)
top_kmers = {'AAAAA', 'CAAAA', 'CATTT', 'TTTTT'}  # most predictive k-mers from the Random Forest
for i, word in enumerate(words):
    if word in top_kmers:
        plt.text(umap[i, 0], umap[i, 1], word, fontsize = 30, c = 'green')
    else:
        plt.text(umap[i, 0], umap[i, 1], word, fontsize = 10, c = 'red')
plt.show()
In [28]:
from sklearn.manifold import TSNE

plt.figure(figsize=(20, 15))
X_reduced = PCA(n_components = 5).fit_transform(X_depl)
tsne_model = TSNE(learning_rate = 500, n_components = 2, random_state = 123, perplexity = 30)
tsne = tsne_model.fit_transform(X_reduced)
plt.scatter(tsne[:, 0], tsne[:, 1], cmap = 'tab10', s = 10)
plt.title('tSNE on PCA: Neanderthal depleted K-mers', fontsize = 20)
plt.xlabel("tSNE1", fontsize = 20)
plt.ylabel("tSNE2", fontsize = 20)
words = list(model_depl.wv.vocab)
top_kmers = {'AAAAA', 'CAAAA', 'CATTT', 'TTTTT'}  # most predictive k-mers from the Random Forest
for i, word in enumerate(words):
    if word in top_kmers:
        plt.text(tsne[i, 0], tsne[i, 1], word, fontsize = 30, c = 'green')
    else:
        plt.text(tsne[i, 0], tsne[i, 1], word, fontsize = 10, c = 'red')
plt.show()

And now, for comparison, let us put the Neanderthal introgressed and depleted k-mers on the same PCA plot, coloring introgressed k-mers red and depleted k-mers blue:

In [44]:
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

X_intr = model_intr[model_intr.wv.vocab]
pca_intr = PCA(n_components = 2)
result_intr = pca_intr.fit_transform(X_intr)

X_depl = model_depl[model_depl.wv.vocab]
pca_depl = PCA(n_components = 2)
result_depl = pca_depl.fit_transform(X_depl)

plt.figure(figsize = (20,18))
plt.scatter(result_intr[:, 0], result_intr[:, 1], s = 10, cmap = 'tab10')
plt.scatter(result_depl[:, 0], result_depl[:, 1], s = 10, cmap = 'tab10')
plt.title('Principal Components Analysis (PCA): Neanderthal introgressed vs. depleted K-mers', fontsize = 20)
plt.xlabel("PC1", fontsize = 20)
plt.ylabel("PC2", fontsize = 20)
words_intr = list(model_intr.wv.vocab)
words_depl = list(model_depl.wv.vocab)
for i_intr, word_intr in enumerate(words_intr):
    plt.text(result_intr[i_intr, 0], result_intr[i_intr, 1], word_intr, fontsize = 5, c = 'red')
for i_depl, word_depl in enumerate(words_depl):
    plt.text(result_depl[i_depl, 0], result_depl[i_depl, 1], word_depl, fontsize = 5, c = 'blue')
plt.show()

The introgressed and depleted k-mers appear to overlap, although the two PCAs above were fitted separately, so the axes are not directly comparable; a cleaner comparison would merge X_intr and X_depl and perform a single PCA on the combined data set.
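The joint PCA just suggested can be sketched as follows. Here random matrices stand in for the real embedding matrices X_intr and X_depl (which would come from the trained Word2Vec models); after projecting the stacked matrix, the first half of the rows correspond to introgressed k-mers and the second half to depleted ones:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(123)
X_intr = rng.normal(size=(1024, 100))  # stand-in for model_intr[model_intr.wv.vocab]
X_depl = rng.normal(size=(1024, 100))  # stand-in for model_depl[model_depl.wv.vocab]

# Fit a single PCA on the stacked matrix so both vocabularies share one coordinate system
X_merged = np.vstack([X_intr, X_depl])
result = PCA(n_components = 2).fit_transform(X_merged)

result_intr = result[:len(X_intr)]  # projected rows for introgressed k-mers
result_depl = result[len(X_intr):]  # projected rows for depleted k-mers
print(result_intr.shape, result_depl.shape)  # → (1024, 2) (1024, 2)
```

The two halves of `result` can then be scattered in red and blue exactly as in the plot above, but in a shared PC1/PC2 space.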
